
2019

In this paper, we introduce two golden ratio algorithms with new stepsize rules for solving pseudomonotone and Lipschitz…

2019

In this brief, future equality-constrained quadratic programming (FECQP) is studied. Via a zeroing neurodynamics method, a…

2016

Complex gradient methods have been widely used in learning theory, and typically aim to optimize real-valued functions of complex…

2016

We consider the emphatic temporal-difference (TD) algorithm, ETD($\lambda$), for learning the value functions of stationary…
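The emphatic variant is beyond a short sketch, but the classical tabular TD(0) update it builds on can be shown in a few lines. This is plain TD(0) on a hypothetical two-state chain, not the emphatic ETD($\lambda$) algorithm analyzed in the paper.

```python
# Plain tabular TD(0) value estimation on a deterministic two-state chain:
# state 0 -> state 1 (reward 0), state 1 -> terminal (reward 1).
# ETD(lambda) extends this basic update with emphasis weights and traces.

GAMMA = 0.9   # discount factor
ALPHA = 0.1   # step size

V = [0.0, 0.0]  # value estimates for states 0 and 1
for _ in range(2000):  # repeated episodes through the chain
    # transition 0 -> 1, reward 0: move V[0] toward 0 + GAMMA * V[1]
    V[0] += ALPHA * (0.0 + GAMMA * V[1] - V[0])
    # transition 1 -> terminal, reward 1: bootstrap value of terminal is 0
    V[1] += ALPHA * (1.0 + GAMMA * 0.0 - V[1])

# In the limit V[1] -> 1.0 and V[0] -> GAMMA * V[1] = 0.9
```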

2011

This paper analyzes the emergence of systemic risk in a network model of interconnected bank balance sheets. Given a shock to…

2008

Conjugate gradient methods are efficient methods for minimizing differentiable objective functions in large dimension…
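For context, the classical (linear) conjugate gradient iteration the snippet refers to can be sketched in a few lines. This is the textbook method for the quadratic case f(x) = ½xᵀAx − bᵀx with symmetric positive-definite A, not any specific variant proposed in the paper.

```python
# Classical conjugate gradient for minimizing f(x) = 0.5 x^T A x - b^T x,
# equivalently solving A x = b with A symmetric positive definite.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n                      # initial guess x0 = 0
    r = list(b)                        # residual r0 = b - A x0 = b
    p = list(r)                        # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)    # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new ** 0.5 < tol:
            break
        beta = rs_new / rs_old         # makes new p A-conjugate to old ones
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Small SPD example: A x = b with exact solution (1/11, 7/11)
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

On an n-dimensional SPD system, exact arithmetic terminates in at most n iterations; here two iterations suffice.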

2001

This paper presents a convergence proof technique for a broad class of proximal algorithms in which the perturbation term is…
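A concrete member of that class is the proximal gradient iteration, where the perturbation term is a proximal operator. The sketch below uses a hypothetical one-dimensional lasso-type objective whose prox is soft-thresholding; it illustrates the algorithm class, not the paper's proof technique.

```python
# Proximal gradient (ISTA-style) iteration on the scalar objective
# f(x) = 0.5 * (x - a)**2 + lam * |x|.
# The proximal ("perturbation") term of lam * |x| is soft-thresholding.

def soft_threshold(z, t):
    """Proximal operator of t * |.| : shrink z toward zero by t."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def prox_gradient(a, lam, step=0.5, iters=100):
    x = 0.0
    for _ in range(iters):
        grad = x - a                                   # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox of nonsmooth part
    return x

# Closed-form minimizer is soft_threshold(a, lam); for a=3, lam=1 that is 2.
x_star = prox_gradient(a=3.0, lam=1.0)
```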

1994

There is considerable evidence suggesting that for Hamiltonian systems of ordinary differential equations it is better to use…
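The "better" integrators meant here are symplectic ones. A minimal sketch, using the Störmer–Verlet (leapfrog) scheme on the harmonic oscillator H(q, p) = (p² + q²)/2 as an assumed test problem (the paper's own systems may differ): symplectic methods keep the energy error bounded over long integrations instead of drifting.

```python
# Stormer-Verlet (leapfrog) integration of the harmonic oscillator
# H(q, p) = (p**2 + q**2) / 2, i.e. dq/dt = p, dp/dt = -q.
# Being symplectic, its energy error stays O(dt**2) for arbitrarily long runs.

def leapfrog(q, p, dt, steps):
    for _ in range(steps):
        p -= 0.5 * dt * q      # half kick: dp/dt = -dH/dq = -q
        q += dt * p            # full drift: dq/dt = dH/dp = p
        p -= 0.5 * dt * q      # half kick
    return q, p

q0, p0 = 1.0, 0.0                                   # energy H = 0.5
q, p = leapfrog(q0, p0, dt=0.01, steps=100_000)     # ~16,000 oscillation periods... 1000 time units
energy_drift = abs(0.5 * (q * q + p * p) - 0.5)
# energy_drift remains of order dt**2 despite the very long integration
```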

1986

A systematic way of extending a general fixed-stepsize multistep formula to a minimum storage variable-stepsize formula has been…

Highly Cited

1982

This paper outlines a number of difficulties which can arise when numerical methods are used to solve systems of differential…
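The best-known such difficulty is stiffness. As an illustration of the general phenomenon (not the paper's particular test problems), on the stiff equation y′ = −50y the explicit Euler method blows up at a step size where implicit (backward) Euler remains stable:

```python
# Stiffness demonstration on y' = -50 * y, y(0) = 1, step dt = 0.1.
# Explicit Euler multiplies y by (1 + dt*lam) = -4 each step and diverges;
# implicit Euler divides y by (1 - dt*lam) = 6 each step and decays.

LAM, DT, STEPS = -50.0, 0.1, 50

y_explicit = 1.0
y_implicit = 1.0
for _ in range(STEPS):
    y_explicit = y_explicit + DT * LAM * y_explicit   # y *= -4: unstable
    y_implicit = y_implicit / (1.0 - DT * LAM)        # y /= 6: stable

# Exact solution y(5) = exp(-250) is essentially zero; explicit Euler has
# grown to magnitude 4**50, while implicit Euler has decayed toward zero.
```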