
Optimal control

Known as: Optimal control theory, Mathematical theory of optimal control, Optimal controller 
Optimal control theory, an extension of the calculus of variations, is a mathematical optimization method for deriving control policies. The method… 
Source: Wikipedia
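To make the definition concrete, here is a minimal sketch of one of the simplest optimal-control problems: the infinite-horizon, discrete-time linear-quadratic regulator for a scalar system. The plant parameters and cost weights below are illustrative assumptions, not taken from any of the works listed on this page.

```python
# Minimal sketch: infinite-horizon discrete-time LQR for a scalar system
#   x[k+1] = a*x[k] + b*u[k],  cost = sum_k (q*x[k]^2 + r*u[k]^2).
# All numbers (a, b, q, r, x0) are illustrative assumptions.

def lqr_gain(a, b, q, r, iters=200):
    """Iterate the scalar discrete algebraic Riccati equation to a fixed
    point, then return the optimal feedback gain K (so u = -K * x)."""
    p = q  # initial value-function coefficient
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

def simulate(a, b, k_gain, x0, steps):
    """Roll out the closed loop x[k+1] = (a - b*K) * x[k]."""
    x = x0
    traj = [x]
    for _ in range(steps):
        x = (a - b * k_gain) * x
        traj.append(x)
    return traj

# An open-loop-unstable plant (a > 1) stabilized by optimal feedback.
K = lqr_gain(a=1.1, b=1.0, q=1.0, r=1.0)
traj = simulate(1.1, 1.0, K, x0=5.0, steps=30)
print(K, abs(traj[-1]))  # gain, and the (small) final state under u = -K x
```

The same Riccati fixed-point structure carries over to the matrix case, which is what the LQG-focused texts below treat in full.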

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2007
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal… 
Highly Cited
1997
Preface.- Basic notations.- Outline of the main ideas on a model problem.- Continuous viscosity solutions of Hamilton-Jacobi… 
Highly Cited
1995
The leading and most up-to-date textbook on the far-ranging algorithmic methodology of Dynamic Programming, which can be used for… 
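Dynamic programming solves optimal control problems by computing a cost-to-go (value) function recursively. A hedged sketch on a toy deterministic shortest-path problem — the graph itself is an illustrative assumption, not an example from the textbook described above:

```python
# Minimal sketch of dynamic programming (value iteration) on a toy
# deterministic shortest-path problem. The graph is an illustrative
# assumption.

import math

# edges[u] = list of (v, cost): moving from node u to node v costs `cost`.
edges = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("C", 1.0), ("D", 5.0)],
    "C": [("D", 1.0)],
    "D": [],  # terminal node
}

def cost_to_go(edges, goal):
    """Bellman recursion: J[u] = min over edges (u -> v, c) of c + J[v],
    with J[goal] = 0. For a positive-cost graph, |nodes| sweeps suffice."""
    J = {u: math.inf for u in edges}
    J[goal] = 0.0
    for _ in range(len(edges)):
        for u, succ in edges.items():
            if u == goal:
                continue
            best = min((c + J[v] for v, c in succ), default=math.inf)
            J[u] = min(J[u], best)
    return J

J = cost_to_go(edges, goal="D")
print(J)  # optimal cost-to-go from each node to D
```

Here the optimal policy from A follows A → B → C → D, total cost 3, which the value function recovers without enumerating paths explicitly.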
Highly Cited
1990
This augmented edition of a respected text teaches the reader how to use linear quadratic Gaussian methods effectively for the… 
Highly Cited
1987
Preface 1. Introduction to nonlinear programming 2. Large, sparse nonlinear programming 3. Optimal control preliminaries 4. The… 
Highly Cited
1979
This best-selling text focuses on the analysis and design of complicated dynamic systems. CHOICE called it "a high-level… 
Review
1975
I The Simplest Problem in Calculus of Variations.- 1. Introduction.- 2. Minimum Problems on an Abstract Space-Elementary Theory… 
Review
1972
Linear Optimal Control Systems.- Feedback Control Theory.- Optimal Control.- Linear Optimal Control.- Optimal Control Systems.- The Zeros of… 
Review
1967
Abstract: This complete and authoritative presentation of the current status of control theory offers a useful foundation for… 
Highly Cited
1960
This is one of the two ground-breaking papers by Kalman that appeared in 1960, with the other one (discussed next) being the…