Boosting with structural sparsity: A differential inclusion approach

@article{Huang2017BoostingWS,
  title={Boosting with structural sparsity: A differential inclusion approach},
  author={Chendi Huang and Xinwei Sun and Jiechao Xiong and Y. Yao},
  journal={Applied and Computational Harmonic Analysis},
  year={2017},
  volume={48},
  pages={1--45}
}
  • Abstract: Boosting, viewed as a gradient descent algorithm, is a popular method in machine learning. In this paper, a novel Boosting-type algorithm is proposed based on restricted gradient descent with structural sparsity control, whose underlying dynamics are governed by differential inclusions. In particular, an iterative regularization path with structural sparsity is presented, where the parameter is sparse under some linear transforms, based on variable splitting and the Linearized Bregman Iteration…
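The abstract names the Linearized Bregman Iteration (LBI) as the core building block. As a minimal illustrative sketch (not the paper's Split LBI algorithm, which additionally uses variable splitting to handle sparsity under a linear transform), here is plain LBI for ℓ1 sparse recovery in NumPy; all parameter choices and the toy problem are assumptions for illustration only.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def linearized_bregman(A, b, kappa=10.0, alpha=None, n_iter=2000):
    """Plain Linearized Bregman Iteration for min ||x||_1 s.t. Ax = b.

    The sequence of iterates acts as an iterative regularization path:
    early iterates are sparser, later iterates fit the data more closely.
    """
    m, n = A.shape
    if alpha is None:
        # Step size chosen so that alpha * kappa * ||A||_2^2 <= 1 < 2 (stability).
        alpha = 1.0 / (kappa * np.linalg.norm(A, 2) ** 2)
    z = np.zeros(n)          # dual (sub-gradient) variable
    path = []
    for _ in range(n_iter):
        x = kappa * soft_threshold(z, 1.0)   # primal iterate
        z -= alpha * A.T @ (A @ x - b)       # gradient step on the residual
        path.append(x.copy())
    return path

# Toy sparse-recovery example (illustrative data, fixed seed).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
b = A @ x_true
path = linearized_bregman(A, b)
print(np.linalg.norm(A @ path[-1] - b))
```

Early stopping along `path` plays the role of the regularization parameter, which is the sense in which the paper's dynamics trace out a regularization path.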


    Citing papers:
    • Parsimonious Deep Learning: A Differential Inclusion Approach with Global Convergence
    • DessiLBI: Exploring Structural Sparsity of Deep Networks via Differential Inclusion Paths
    • S2-LBI: Stochastic Split Linearized Bregman Iterations for Parsimonious Deep Learning
    • iSplit LBI: Individualized Partial Ranking with Ties via Split LBI
