# Stepsize


## Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2019 • Mathematical Methods in the Applied Sciences • Corpus ID: 118627885
In this paper, we introduce two golden ratio algorithms with new stepsize rules for solving pseudomonotone and Lipschitz…

2019 • IEEE Transactions on Neural Networks and Learning… • Corpus ID: 52069649
In this brief, future equality-constrained quadratic programming (FECQP) is studied. Via a zeroing neurodynamics method, a…

2016 • IEEE Transactions on Neural Networks and Learning… • Corpus ID: 10509873
Complex gradient methods have been widely used in learning theory, and typically aim to optimize real-valued functions of complex…

2016 • Huizhen Yu • J. Mach. Learn. Res. • Corpus ID: 11532785
We consider the emphatic temporal-difference (TD) algorithm, ETD($\lambda$), for learning the value functions of stationary…

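Convergence analyses of stochastic algorithms such as ETD($\lambda$) typically rest on Robbins–Monro stepsize conditions ($\sum_t \alpha_t = \infty$, $\sum_t \alpha_t^2 < \infty$). As a minimal illustration of such a schedule (not the ETD($\lambda$) algorithm itself), the stepsize $\alpha_t = 1/t$ turns a stochastic-approximation update into an exact running average:

```python
def robbins_monro_mean(samples):
    """Stochastic-approximation estimate of a mean with stepsize a_t = 1/t.

    The schedule a_t = 1/t satisfies the Robbins-Monro conditions, and with
    this particular choice the iterate equals the running sample mean exactly.
    """
    x = 0.0
    for t, s in enumerate(samples, start=1):
        x += (s - x) / t  # x_t = x_{t-1} + a_t * (s_t - x_{t-1})
    return x

estimate = robbins_monro_mean([1.0, 2.0, 3.0, 4.0])  # equals the mean, 2.5
```
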
2011 • Corpus ID: 54776530
This paper analyzes the emergence of systemic risk in a network model of interconnected bank balance sheets. Given a shock to…

2008 • Corpus ID: 18533134
Conjugate gradient methods are efficient methods for minimizing differentiable objective functions in large dimension…

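For context on the role of the stepsize here, a generic nonlinear conjugate gradient iteration (Polak–Ribière+ variant with a backtracking Armijo stepsize — a sketch of the method family, not the specific schemes studied in the paper) can look like:

```python
import numpy as np

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the stepsize until sufficient decrease holds."""
    while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad, x0, iters=50):
    """Nonlinear CG with the Polak-Ribiere+ coefficient and Armijo stepsize."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ g < 1e-16:          # gradient vanished: converged
            break
        if g @ d >= 0:             # safeguard: fall back to steepest descent
            d = -g
        alpha = backtracking(f, x, d, g)
        x = x + alpha * d
        g_new = grad(x)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Toy problem: minimize f(x) = 0.5 x^T A x - b^T x, whose minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```
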
2001 • SIAM J. Optim. • Corpus ID: 15050238
This paper presents a convergence proof technique for a broad class of proximal algorithms in which the perturbation term is…

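As an illustration of the proximal-algorithm setting (a generic proximal gradient sketch, not the paper's perturbed framework), one iteration applies a gradient step with a fixed stepsize followed by a proximal operator; for the $\ell_1$ penalty the prox is soft-thresholding:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, step, iters=500):
    """Proximal gradient (ISTA) for min 0.5 ||Ax - b||^2 + lam ||x||_1.

    Uses a constant stepsize; convergence requires step <= 1/L, where L is
    the largest eigenvalue of A^T A.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)
    return x

# With A = I the minimizer is soft_threshold(b, lam), here [2, 0].
x_hat = ista(np.eye(2), np.array([3.0, 0.5]), lam=1.0, step=1.0)
```
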
1994
There is considerable evidence suggesting that for Hamiltonian systems of ordinary differential equations it is better to use…

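The usual argument for fixed stepsizes on Hamiltonian systems involves symplectic integrators, whose energy error stays bounded over long runs. A minimal Störmer–Verlet (leapfrog) sketch for a harmonic oscillator (a generic illustration, not the paper's analysis):

```python
def leapfrog(q, p, h, steps, dVdq):
    """Stoermer-Verlet (leapfrog) with fixed stepsize h for H = p^2/2 + V(q)."""
    for _ in range(steps):
        p -= 0.5 * h * dVdq(q)   # half kick
        q += h * p               # drift
        p -= 0.5 * h * dVdq(q)   # half kick
    return q, p

# Harmonic oscillator: V(q) = q^2 / 2, dV/dq = q, energy H = (p^2 + q^2) / 2.
q, p = leapfrog(1.0, 0.0, h=0.1, steps=10_000, dVdq=lambda q: q)
energy = 0.5 * (p * p + q * q)   # stays near the initial value 0.5
```

Even after 10,000 steps the energy drifts only by an O(h²) bounded oscillation, which is the behavior that makes fixed-stepsize symplectic schemes attractive here.
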
1986
A systematic way of extending a general fixed-stepsize multistep formula to a minimum storage variable-stepsize formula has been…

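To show what variable-stepsize integration means in the simplest case (a generic step-doubling controller on forward Euler — not the paper's minimum-storage multistep construction), one can estimate the local error by comparing one full step against two half steps and adapt h accordingly:

```python
def adaptive_euler(f, t0, y0, t_end, tol=1e-6, h=0.1):
    """Forward Euler with a variable stepsize chosen by step doubling.

    The local error is estimated as the difference between one step of size h
    and two steps of size h/2; h is then grown or shrunk to keep that
    estimate near tol.
    """
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        full = y + h * f(t, y)                        # one step of size h
        half = y + 0.5 * h * f(t, y)                  # two steps of size h/2
        half = half + 0.5 * h * f(t + 0.5 * h, half)
        err = abs(half - full)                        # local error estimate
        if err <= tol:
            t, y = t + h, half                        # accept the better value
        # Standard controller: scale h by (tol/err)^(1/2), clipped to [0.2, 2].
        h *= min(2.0, max(0.2, 0.9 * (tol / max(err, 1e-16)) ** 0.5))
    return y

# y' = y, y(0) = 1 integrated to t = 1 should land near e ~ 2.71828.
y1 = adaptive_euler(lambda t, y: y, 0.0, 1.0, 1.0)
```
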
Highly Cited • 1982
This paper outlines a number of difficulties which can arise when numerical methods are used to solve systems of differential…