Gradient-free method for nonsmooth distributed optimization

@article{Li2015GradientfreeMF,
  title={Gradient-free method for nonsmooth distributed optimization},
  author={Jueyou Li and Chi-haur Wu and Zhiyou Wu and Qiang Long},
  journal={Journal of Global Optimization},
  year={2015},
  volume={61},
  pages={325--340}
}
In this paper, we consider a distributed nonsmooth optimization problem over a computational multi-agent network. We first extend Nesterov's (centralized) random gradient-free algorithm and Gaussian smoothing technique to the distributed setting. We then prove convergence of the algorithm and give an explicit convergence rate in terms of the network size and topology. The proposed method is gradient-free, which may be preferred in practical engineering applications. Since only the cost…
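To illustrate the idea the abstract describes, here is a minimal sketch of a distributed gradient-free method: each agent averages its state with its neighbors (a consensus step with a doubly stochastic mixing matrix), then takes a step along a Nesterov-style two-point Gaussian-smoothing estimate of its local subgradient. The ring network, mixing weights, local costs |x − aᵢ|, and step-size schedule are all illustrative assumptions, not the paper's exact algorithm or rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def gf_oracle(f, x, mu=1e-4):
    """Random gradient-free oracle via Gaussian smoothing:
    g = (f(x + mu*u) - f(x)) / mu * u, with u ~ N(0, I).
    Requires only cost evaluations, no (sub)gradients."""
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

# Illustrative 4-agent ring network with a doubly stochastic mixing matrix.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

# Nonsmooth local costs f_i(x) = |x - a_i|; the network minimizes their sum,
# whose minimizers are the medians of {1, 2, 3, 4}, i.e. the interval [2, 3].
targets = np.array([1.0, 2.0, 3.0, 4.0])
costs = [lambda x, a=a: abs(float(x) - a) for a in targets]

x = np.zeros(4)  # one scalar decision variable per agent
for k in range(1, 5001):
    x = W @ x                      # consensus (mixing) step with neighbors
    step = 1.0 / np.sqrt(k)        # diminishing step size (assumed schedule)
    grads = np.array([gf_oracle(costs[i], np.array([x[i]]))[0]
                      for i in range(4)])
    x = x - step * grads           # local gradient-free descent step

print(x)  # agent states should cluster near the median interval [2, 3]
```

With only zeroth-order (cost-value) information, the agents reach approximate consensus near a minimizer of the aggregate nonsmooth cost; this is the "free of gradient" property the abstract highlights.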

Citations

Publications citing this paper (16 total; two shown below).

Distributed Stochastic Approximation Algorithm With Expanding Truncations

Cites background, results & methods (5 excerpts). Highly influenced.

Exact Convergence of Gradient-Free Distributed Optimization Method in a Multi-Agent System

  2018 IEEE Conference on Decision and Control (CDC), 2018.
  Cites background (1 excerpt).