
Truncated Newton method

Known as: Hessian-free optimization 
Truncated Newton methods, also known as Hessian-free optimization, are a family of optimization algorithms designed for optimizing non-linear… (Wikipedia)
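As a rough illustration of the idea, the sketch below applies truncated Newton (Hessian-free) steps to the 2-D Rosenbrock function: the Newton system H p = -g is solved only approximately by conjugate gradients, and the Hessian is never formed explicitly, only Hessian-vector products estimated by finite differences of the gradient. The test function, tolerances, and iteration caps are illustrative assumptions, not taken from any of the papers listed below.

```python
# Minimal sketch of truncated Newton / Hessian-free optimization (assumptions:
# 2-D Rosenbrock test problem, illustrative tolerances and iteration caps).
import numpy as np

def rosenbrock_grad(x):
    """Analytic gradient of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2."""
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v by central differences of the gradient,
    so the Hessian itself is never formed ("Hessian-free")."""
    return (grad(x + eps * v) - grad(x - eps * v)) / (2.0 * eps)

def truncated_newton_step(grad, x, max_cg_iters=10, rtol=1e-6):
    """Approximately solve H p = -g with conjugate gradients, truncating
    after max_cg_iters iterations, at the relative tolerance, or when
    negative curvature is detected."""
    g = grad(x)
    p = np.zeros_like(x)      # CG iterate: the Newton direction being built
    r = -g.copy()             # residual of H p = -g (starts at -g since p = 0)
    d = r.copy()              # CG search direction
    rs_old = r @ r
    for _ in range(max_cg_iters):
        Hd = hessian_vector_product(grad, x, d)
        dHd = d @ Hd
        if dHd <= 0.0:        # negative curvature: stop with the current p
            break
        alpha = rs_old / dHd
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) <= rtol * np.sqrt(g @ g):
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

x = np.array([-1.2, 1.0])
for _ in range(20):
    x = x + truncated_newton_step(rosenbrock_grad, x)
print(x)  # should approach the minimizer at (1, 1)
```

Truncating the inner CG loop, and stopping early on negative curvature, is what distinguishes the method from an exact Newton step; in large-scale settings the finite-difference product is usually replaced by an exact Hessian-vector (or Gauss-Newton-vector) product from automatic differentiation.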

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2017
Full waveform inversion (FWI) is a powerful method for reconstructing subsurface parameters from local measurements of the… 
Highly Cited
2014
Full waveform inversion is a powerful tool for quantitative seismic imaging from wide‐azimuth seismic data. The method is based… 
Highly Cited
2013
Deep and recurrent neural networks (DNNs and RNNs respectively) are powerful models that were considered to be almost impossible… 
Highly Cited
2013
Full waveform inversion (FWI) is a powerful method for reconstructing subsurface parameters from local measurements of the… 
Highly Cited
2011
Recurrent Neural Networks (RNNs) are very powerful sequence models that do not enjoy widespread use because it is extremely… 
Highly Cited
2011
In this work we resolve the long-outstanding problem of how to effectively train recurrent neural networks (RNNs) on complex and… 
Highly Cited
2010
We develop a 2nd-order optimization method based on the "Hessian-free" approach, and apply it to training deep auto-encoders… 
Highly Cited
1993
Regularization algorithms are often used to produce reasonable solutions to ill-posed problems. The L-curve is a plot—for all… 
Highly Cited
1989
In this paper, an unconstrained minimization algorithm is defined in which a nonmonotone line search technique is employed in… 
Highly Cited
1987
Techniques from numerical analysis and crystallographic refinement have been combined to produce a variant of the Truncated…