
Rate of convergence

Known as: R-linear, Q-linear, Speed of convergence 
In numerical analysis, the speed at which a convergent sequence approaches its limit is called the rate of convergence. Although strictly speaking, a… 
Wikipedia
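For context, a brief sketch of the standard definitions behind the terms listed above (notation is mine, not part of the excerpt): a sequence $x_k \to x^*$ converges Q-linearly with rate $\mu \in (0,1)$ if

$\lim_{k \to \infty} \|x_{k+1} - x^*\| / \|x_k - x^*\| = \mu,$

Q-superlinearly if this limit is $0$, and with Q-order $q > 1$ (Q-quadratically for $q = 2$) if $\|x_{k+1} - x^*\| \le M \|x_k - x^*\|^q$ for some constant $M$. R-linear convergence is the weaker requirement that $\|x_k - x^*\|$ be bounded above by a sequence that converges Q-linearly to $0$.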

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2020
Federated learning enables a large amount of edge computing devices to jointly learn a model without data sharing. As a leading… 
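A minimal sketch of the federated-averaging idea this entry refers to, assuming the usual setup of a few local SGD steps per device followed by server-side parameter averaging; the least-squares model, device data, step size, and round count are illustrative placeholders, not taken from the paper.

import numpy as np

def local_sgd(w, X, y, lr=0.1, steps=5):
    """Run a few local least-squares SGD steps on one device's data."""
    w = w.copy()
    for _ in range(steps):
        i = np.random.randint(len(y))
        grad = (X[i] @ w - y[i]) * X[i]      # gradient of 0.5*(x_i.w - y_i)^2
        w -= lr * grad
    return w

def fedavg_round(w_global, devices):
    """One communication round: each device trains locally, the server averages."""
    local_models = [local_sgd(w_global, X, y) for X, y in devices]
    return np.mean(local_models, axis=0)

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
devices = []
for _ in range(4):                            # four simulated edge devices
    X = rng.normal(size=(50, 3))
    devices.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(3)
for r in range(20):
    w = fedavg_round(w, devices)
print("estimate:", w)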
Highly Cited
2012
Alternating direction methods (ADMs) have been well studied in the literature, and they have found many efficient applications in… 
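A minimal sketch of one alternating direction method of multipliers (ADMM) loop, shown on a lasso problem for concreteness; the splitting, penalty parameter rho, and test data are my own assumptions rather than anything from the paper.

import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=100):
    """ADMM for min 0.5*||Ax-b||^2 + lam*||x||_1, split as x = z."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor once, reuse every iteration
    x = z = u = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)        # prox step on the l1 term
        u = u + x - z                               # scaled dual update
    return z

rng = np.random.default_rng(1)
A = rng.normal(size=(60, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.normal(size=60)
print(admm_lasso(A, b)[:5])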
Highly Cited
2012
We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly… 
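A hedged sketch of the stochastic average gradient idea described here: keep one stored gradient per summand, refresh one of them per iteration, and step along the running average. The step size and the quadratic test problem are illustrative assumptions.

import numpy as np

def sag(grads, n, dim, lr=0.01, iters=2000):
    """Stochastic average gradient: one stored gradient per summand, updated
    one at a time; the iterate moves along the average of the stored gradients."""
    w = np.zeros(dim)
    memory = np.zeros((n, dim))       # last gradient seen for each summand
    avg = np.zeros(dim)
    for _ in range(iters):
        i = np.random.randint(n)
        g = grads(i, w)
        avg += (g - memory[i]) / n    # keep the running average consistent
        memory[i] = g
        w -= lr * avg
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true
w_hat = sag(lambda i, w: (X[i] @ w - y[i]) * X[i], n=200, dim=5)
print(np.linalg.norm(w_hat - w_true))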
Highly Cited
2011
We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using proximal… 
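A minimal proximal-gradient (ISTA-style) sketch for a smooth-plus-nonsmooth objective of the kind described here, using an l1 regularizer so the proximal step is a simple soft-threshold; the concrete problem and step-size choice are assumptions for illustration.

import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam=0.1, iters=500):
    """Gradient step on the smooth part 0.5*||Ax-b||^2, then the prox of lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, with L the gradient's Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_l1(x - step * (A.T @ (A @ x - b)), step * lam)
    return x

rng = np.random.default_rng(3)
A = rng.normal(size=(60, 20))
b = A @ np.concatenate([[1.5, -1.0], np.zeros(18)])
print(proximal_gradient(A, b)[:4])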
Highly Cited
2010
We present a new family of subgradient methods that dynamically incorporate knowledge of the geometry of the data observed in… 
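A small sketch of a diagonal adaptive subgradient update of the kind this entry describes, where each coordinate's step is scaled by the accumulated squared gradients it has seen; the learning rate and the poorly scaled quadratic test function are my own choices.

import numpy as np

def adagrad(grad, x0, lr=0.5, iters=500, eps=1e-8):
    """Per-coordinate steps scaled by the accumulated squared (sub)gradients."""
    x = np.array(x0, dtype=float)
    accum = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        accum += g ** 2
        x -= lr * g / (np.sqrt(accum) + eps)
    return x

# Illustrative use on a badly scaled quadratic (my own test problem).
scales = np.array([100.0, 1.0, 0.01])
print(adagrad(lambda x: 2 * scales * x, x0=[1.0, 1.0, 1.0]))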
Highly Cited
2009
  • Xin-She Yang, S. Deb
  • World Congress on Nature & Biologically Inspired…
  • 2009
  • Corpus ID: 206491725
In this paper, we intend to formulate a new meta-heuristic algorithm, called Cuckoo Search (CS), for solving optimization… 
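A simplified sketch of a cuckoo-search-style loop (Levy-flight proposals, greedy replacement, and abandonment of a fraction of the worst nests); the parameter values, bounds, and sphere test function are illustrative assumptions, not the authors' reference implementation.

import numpy as np
from math import gamma, sin, pi

def levy_step(size, rng, beta=1.5):
    """Heavy-tailed step via Mantegna's algorithm, commonly used for Levy flights."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim, n_nests=15, pa=0.25, alpha=0.01, iters=500):
    """Propose new solutions by Levy flights, keep improvements, and abandon
    a fraction pa of the worst nests each generation."""
    rng = np.random.default_rng(0)
    nests = rng.uniform(-5, 5, size=(n_nests, dim))
    fitness = np.array([f(x) for x in nests])
    best = nests[np.argmin(fitness)].copy()
    for _ in range(iters):
        for i in range(n_nests):
            candidate = nests[i] + alpha * levy_step(dim, rng) * (nests[i] - best)
            if f(candidate) < fitness[i]:
                nests[i], fitness[i] = candidate, f(candidate)
        worst = np.argsort(fitness)[-int(pa * n_nests):]      # abandon worst nests
        nests[worst] = rng.uniform(-5, 5, size=(len(worst), dim))
        fitness[worst] = [f(x) for x in nests[worst]]
        best = nests[np.argmin(fitness)].copy()
    return best

print(cuckoo_search(lambda x: np.sum(x ** 2), dim=3))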
Highly Cited
2004
We propose a prox-type method with efficiency estimate $O(\epsilon^{-1})$ for approximating saddle points of convex-concave C$^{1… 
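A hedged sketch of a prox-type (extragradient) iteration for a convex-concave saddle-point problem, which is the setting this entry describes; the bilinear objective, step size, and iteration count are illustrative assumptions rather than the paper's method as stated.

import numpy as np

def extragradient_saddle(A, eta=0.1, iters=2000):
    """Extragradient iteration for min_x max_y x^T A y: a look-ahead gradient
    step, then a corrected step using the look-ahead point's gradients. Plain
    gradient descent-ascent cycles on this problem; the extra step converges
    to the saddle point at the origin."""
    n, m = A.shape
    x, y = np.ones(n), np.ones(m)
    for _ in range(iters):
        xh = x - eta * (A @ y)            # look-ahead step
        yh = y + eta * (A.T @ x)
        x = x - eta * (A @ yh)            # corrected step
        y = y + eta * (A.T @ xh)
    return x, y

A = np.array([[1.0, 2.0], [0.0, 1.0]])
x, y = extragradient_saddle(A)
print(np.linalg.norm(x), np.linalg.norm(y))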
Highly Cited
2001
It is well known that the analysis of the large-time asymptotics of Fokker-Planck type equations by the entropy method is closely… 
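A compressed sketch of the exponential decay estimate that the entropy method yields in this setting, stated with the usual conventions (details and constants vary between papers): for the Fokker-Planck equation $\partial_t \rho = \nabla \cdot (\nabla \rho + \rho \nabla V)$ with steady state $\rho_\infty \propto e^{-V}$, the relative entropy $e(t) = \int \rho \log(\rho/\rho_\infty)\,dx$ satisfies $\frac{d}{dt} e(t) = -I(\rho\,|\,\rho_\infty)$, where $I$ is the relative Fisher information. If a logarithmic Sobolev inequality $e \le \frac{1}{2\lambda} I$ holds, for instance when $\nabla^2 V \ge \lambda\,\mathrm{Id}$, then $e(t) \le e(0)\,e^{-2\lambda t}$, i.e. exponential convergence to equilibrium in relative entropy.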
Highly Cited
1993
Highly Cited
1986
The annealing algorithm is a stochastic optimization method which has attracted attention because of its success with certain…
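A basic sketch of the annealing loop described here: propose a random perturbation, always accept improvements, accept worse moves with probability $\exp(-\Delta/T)$, and lower the temperature over time. The geometric cooling schedule, proposal scale, and test function below are illustrative assumptions.

import numpy as np

def simulated_annealing(f, x0, temp0=1.0, cooling=0.995, iters=5000):
    """Metropolis-style annealing with a geometric cooling schedule."""
    rng = np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    fx, T = f(x), temp0
    best, fbest = x.copy(), fx
    for _ in range(iters):
        cand = x + rng.normal(scale=0.5, size=x.shape)   # random perturbation
        fc = f(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
        T *= cooling                                     # gradually lower the temperature
    return best, fbest

# Illustrative multimodal test function (my own choice, not from the paper).
f = lambda x: np.sum(x ** 2) + 2 * np.sin(5 * x).sum()
print(simulated_annealing(f, x0=[3.0, -3.0]))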