
Stochastic optimization

Known as: Stochastic optimisation, Stochastic search 
Stochastic optimization (SO) methods are optimization methods that generate and use random variables. For stochastic problems, the random variables… 
Source: Wikipedia
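To make the definition concrete, here is a minimal sketch of stochastic search in the sense above: the method generates random variables (Gaussian perturbations of the current point) and uses them to drive the optimization. The function name and hyperparameters are illustrative, not from any cited paper.

```python
import random

def stochastic_search(f, x0, steps=2000, scale=0.5, seed=0):
    """Minimal stochastic search: propose a Gaussian perturbation of the
    incumbent point and keep it only if it lowers the objective."""
    rng = random.Random(seed)
    x, best = x0, f(x0)
    for _ in range(steps):
        candidate = x + rng.gauss(0.0, scale)  # random proposal
        value = f(candidate)
        if value < best:                       # greedy accept/reject
            x, best = candidate, value
    return x

# demo: minimize f(x) = (x - 3)^2
x = stochastic_search(lambda x: (x - 3.0) ** 2, x0=0.0)
```

Even this crude scheme converges on smooth unimodal problems; the papers below study far more structured stochastic methods.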

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2014
We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive… 
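The Adam update described in this paper maintains exponentially decaying averages of past gradients and squared gradients, with bias correction. A minimal sketch, using the standard default hyperparameters from the paper (the demo loop and quadratic objective are illustrative):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: adaptive moment estimates with bias correction."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second raw moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# demo: minimize f(theta) = theta^2
theta, m, v = 3.0, 0.0, 0.0
for t in range(1, 201):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
```

The per-coordinate scaling by the second-moment estimate is what makes the step sizes adaptive.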
Highly Cited
2014
Stochastic gradient descent (SGD) is a popular technique for large-scale optimization problems in machine learning. In order to… 
Highly Cited
2012
This paper evaluates the real-time price-based demand response (DR) management for residential appliances via stochastic… 
Highly Cited
2011
We present a new family of subgradient methods that dynamically incorporate knowledge of the geometry of the data observed in… 
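A sketch of an adaptive (AdaGrad-style) subgradient update in the spirit of this snippet: each coordinate's effective step size shrinks according to the squared gradients accumulated along that coordinate, so the geometry of the observed data shapes the updates. Hyperparameters and the demo objective are illustrative assumptions.

```python
import numpy as np

def adagrad_step(theta, grad, G, lr=0.5, eps=1e-8):
    """Adaptive subgradient update: per-coordinate step sizes derived
    from the running sum of squared gradients."""
    G = G + grad ** 2                              # accumulate geometry info
    theta = theta - lr * grad / (np.sqrt(G) + eps) # coordinate-wise scaling
    return theta, G

# demo: minimize f(theta) = ||theta||^2
theta = np.array([3.0, -2.0])
G = np.zeros_like(theta)
for _ in range(500):
    theta, G = adagrad_step(theta, 2 * theta, G)
```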
Highly Cited
2011
We analyze the convergence of gradient-based optimization algorithms that base their updates on delayed stochastic gradient… 
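The delayed-gradient setting this snippet analyzes can be sketched as follows: the gradient applied at each step was computed several iterations earlier, as happens in asynchronous or distributed training. The queue-based simulation and all parameters are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from collections import deque

def delayed_sgd(grad, theta0, tau=3, lr=0.05, steps=400):
    """Gradient descent where each applied gradient is tau steps stale."""
    theta = np.asarray(theta0, dtype=float)
    pending = deque()
    for _ in range(steps):
        pending.append(grad(theta))                 # gradient computed now...
        if len(pending) > tau:
            theta = theta - lr * pending.popleft()  # ...applied tau steps later
    return theta

# demo: minimize 0.5 * ||theta||^2, whose gradient is theta itself
theta = delayed_sgd(lambda th: th, [4.0, -2.0])
```

With a small enough step size relative to the delay, the iterates still converge, which is the kind of regime such convergence analyses characterize.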
Highly Cited
2011
We present a new approach to motion planning using a stochastic trajectory optimization framework. The approach relies on… 
Highly Cited
2011
Stochastic gradient descent (SGD) is a simple and popular method to solve stochastic optimization problems which arise in machine… 
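A minimal sketch of plain minibatch SGD on a toy machine-learning problem (least-squares regression); the synthetic data, batch size, and learning rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic linear data: y = 2x + 1 + noise
X = rng.uniform(-1, 1, size=(500, 1))
y = 2 * X[:, 0] + 1 + 0.1 * rng.normal(size=500)

w, b, lr = 0.0, 0.0, 0.1
for epoch in range(50):
    idx = rng.permutation(len(X))                 # reshuffle each epoch
    for start in range(0, len(X), 32):            # minibatches of 32
        batch = idx[start:start + 32]
        err = w * X[batch, 0] + b - y[batch]      # residuals on the minibatch
        w -= lr * np.mean(err * X[batch, 0])      # gradient of 0.5 * MSE w.r.t. w
        b -= lr * np.mean(err)                    # gradient w.r.t. b
```

Each update uses only a small random sample of the data, which is what makes the method scale to large problems.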
Highly Cited
2008
In this paper we consider optimization problems where the objective function is given in a form of the expectation. A basic… 
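When the objective is an expectation, a standard approach is sample average approximation (SAA): replace the expectation with an average over drawn scenarios and optimize that surrogate. A sketch on a hypothetical newsvendor-style problem (the cost model, distribution, and grid search are illustrative assumptions, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical newsvendor problem: choose order quantity x to
# minimize E[c*x - p*min(x, D)] for random demand D
c, p = 1.0, 3.0
demand = rng.exponential(scale=10.0, size=5000)  # scenario samples of D

def saa_objective(x):
    # sample average approximation of the expected cost
    return np.mean(c * x - p * np.minimum(x, demand))

xs = np.linspace(0.0, 50.0, 501)
x_star = xs[np.argmin([saa_objective(x) for x in xs])]
# analytically, the optimum is the (p - c)/p = 2/3 quantile of D: 10*ln(3) ~ 11.0
```

With enough samples, the minimizer of the sample-average objective concentrates around the true expected-cost minimizer.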
Review
2003
From the Publisher:
- Unique in its survey of the range of topics.
- Contains a strong, interdisciplinary format that will…
Highly Cited
1991
This paper presents a methodology for the solution of multistage stochastic optimization problems, based on the approximation of…