Corpus ID: 199000989

Nonconvex Zeroth-Order Stochastic ADMM Methods with Lower Function Query Complexity

@article{Huang2019NonconvexZS,
  title={Nonconvex Zeroth-Order Stochastic ADMM Methods with Lower Function Query Complexity},
  author={F. Huang and Shangqian Gao and J. Pei and Heng Huang},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.13463}
}
  • Mathematics, Computer Science
  • Zeroth-order (gradient-free) methods are a class of powerful optimization tools for many machine learning problems because they need only function values (not gradients) during optimization. In particular, zeroth-order methods are well suited to complex problems such as black-box attacks and bandit feedback, whose explicit gradients are difficult or infeasible to obtain. Recently, although many zeroth-order methods have been developed, these approaches still have two main drawbacks: 1) high…
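The core idea in the abstract — optimizing using only function evaluations — can be illustrated with a standard two-point gradient estimator. This is a generic sketch, not the paper's ZO-ADMM algorithm; the function name `zo_grad` and all step sizes are illustrative choices.

```python
import numpy as np

def zo_grad(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Queries only f-values: a central finite difference along a random
    unit direction u, scaled by the dimension d so that the estimate
    is unbiased for the gradient of a smoothed version of f.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)            # random unit direction
    d = x.size
    return d * (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

# Usage: minimize f(x) = ||x||^2 with plain zeroth-order gradient descent.
f = lambda x: float(x @ x)
rng = np.random.default_rng(0)
x = np.ones(5)
for _ in range(2000):
    x = x - 0.05 * zo_grad(f, x, rng=rng)
print(f(x))   # converges toward 0
```

The "high function query complexity" drawback the abstract begins to describe refers to how many such f-evaluations (two per estimate here) an algorithm needs overall to reach a stationary point.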
