Corpus ID: 239016561

Fast Projection onto the Capped Simplex with Applications to Sparse Regression in Bioinformatics

@article{Ang2021FastPO,
  title={Fast Projection onto the Capped Simplex with Applications to Sparse Regression in Bioinformatics},
  author={Andersen Man Shun Ang and Jianzhu Ma and Nianjun Liu and Kun Huang and Yijie Wang},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.08471}
}
We consider the problem of projecting a vector onto the so-called k-capped simplex, which is a hypercube cut by a hyperplane. For an n-dimensional input vector with bounded elements, we find that a simple algorithm based on Newton's method solves the projection problem to high precision with complexity roughly O(n), a much lower computational cost than the existing sorting-based methods proposed in the literature. We provide a theory for partial…
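To make the complexity claim concrete, here is a minimal Python sketch of this style of projection. It is not the authors' exact algorithm: it assumes the standard dual form, in which the projection is x_i = clip(y_i - tau, 0, 1) for a scalar tau fixed by the sum constraint, and the function name, bisection safeguard, and default tolerances are illustrative choices.

import numpy as np

def project_capped_simplex(y, k, tol=1e-10, max_iter=100):
    """Euclidean projection of y onto the k-capped simplex
    {x : sum(x) = k, 0 <= x <= 1}.  The optimal point has the form
    x_i = clip(y_i - tau, 0, 1) for a scalar tau chosen so that the
    coordinates sum to k; tau is found by Newton steps on the
    piecewise-linear residual, safeguarded by a bisection bracket."""
    y = np.asarray(y, dtype=float)
    lo, hi = y.min() - 1.0, y.max()      # residual is >= 0 at lo and <= 0 at hi
    tau = (y.sum() - k) / y.size         # initial guess
    for _ in range(max_iter):
        x = np.clip(y - tau, 0.0, 1.0)
        g = x.sum() - k                  # residual of the sum constraint
        if abs(g) < tol:
            break
        lo, hi = (tau, hi) if g > 0 else (lo, tau)   # keep a valid bracket
        # slope of sum(clip(y - tau, 0, 1)) w.r.t. tau is minus the number
        # of coordinates strictly between the bounds
        active = np.count_nonzero((y - tau > 0.0) & (y - tau < 1.0))
        step = tau + g / active if active > 0 else None
        tau = step if step is not None and lo < step < hi else 0.5 * (lo + hi)
    return np.clip(y - tau, 0.0, 1.0)

# Example: project a random 6-vector onto the 2-capped simplex
rng = np.random.default_rng(0)
x = project_capped_simplex(rng.normal(size=6), k=2)
print(x, x.sum())    # entries in [0, 1], summing to 2

Each iteration costs O(n), and under the assumptions above only a handful of iterations are needed, which is consistent with the roughly O(n) behaviour described in the abstract.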

References

Showing 1-10 of 21 references
Efficient projections onto the l1-ball for learning in high dimensions
TLDR: Describes efficient algorithms for projecting a vector onto the l1-ball and shows that variants of stochastic gradient projection methods augmented with these projection procedures outperform interior-point methods, which are considered state-of-the-art optimization techniques. (A sketch of this style of sorting-based projection appears after the reference list.)
Optimizing Costly Functions with Simple Constraints: A Limited-Memory Projected Quasi-Newton Algorithm
TLDR: An optimization algorithm for minimizing a smooth function over a convex set by minimizing a diagonal-plus-low-rank quadratic approximation to the function, which substantially improves on state-of-the-art methods for problems such as learning the structure of Gaussian graphical models and Markov random fields.
Sparse high-dimensional regression: Exact scalable algorithms and phase transitions
We present a novel binary convex reformulation of the sparse regression problem that constitutes a new duality perspective. We devise a new cutting plane method and provide evidence that it can solve…
Projection onto the capped simplex
We provide a simple and efficient algorithm for computing the Euclidean projection of a point onto the capped simplex, formally defined as $\min_{x \in \mathbb{R}^D} \frac{1}{2}\|x - y\|^2$ s.t. $x^\top \mathbf{1} = s$, $0 \le x \le 1$, together…
Sparse learning via Boolean relaxations
TLDR: Novel relaxations for cardinality-constrained learning problems, including least-squares regression as a special but important case, are introduced, and it is shown that randomization based on the relaxed solution offers a principled way to generate provably good feasible solutions.
Sparse Regression: Scalable Algorithms and Empirical Performance
TLDR: Accuracy, false detection, and computational time provide a comprehensive assessment of each feature selection method and shed light on alternatives to Lasso regularization that are not yet as popular in practice.
Projected Newton-type methods in machine learning
TLDR: An algorithmic framework for projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields is introduced, and it is shown how to apply the framework to handle non-smooth objectives.
Regularization Paths for Generalized Linear Models via Coordinate Descent
TLDR: In comparative timings, the new algorithms are considerably faster than competing methods, can handle large problems, and can also deal efficiently with sparse features.
Evaluation of the lasso and the elastic net in genome-wide association studies
TLDR: It is concluded that it is important to analyze GWAS data with both the lasso and the elastic net, and that an alternative tuning criterion to minimum MSE is needed for variable selection.
Optimization Problems with Perturbations: A Guided Tour
TLDR: The emphasis is on methods based on upper and lower estimates of the objective function of the perturbed problems, which allow one to compute expansions of the optimal value function and approximate optimal solutions in situations where the set of Lagrange multipliers is not a singleton, may be unbounded, or is even empty.
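As a point of comparison with the Newton-based projection sketched above, the following is a compact Python sketch of the kind of classic sorting-based l1-ball projection described in the first reference; the O(n log n) cost comes from the sort, and the function name and radius parameter are illustrative.

import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the l1-ball of radius z via the
    classic sort-and-threshold scheme (O(n log n) from the sort)."""
    v = np.asarray(v, dtype=float)
    if np.abs(v).sum() <= z:
        return v.copy()                               # already inside the ball
    u = np.sort(np.abs(v))[::-1]                      # magnitudes, descending
    cumsum = np.cumsum(u)
    # largest index whose coordinate survives the soft threshold
    rho = np.nonzero(u * np.arange(1, v.size + 1) > cumsum - z)[0][-1]
    theta = (cumsum[rho] - z) / (rho + 1.0)           # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

The sort is what the abstract's Newton-based approach avoids, replacing it with a small number of O(n) passes over the data.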