Semantic Scholar
Dynamic programming
Known as: Dynamic optimization, Dynamic programming/Implementations and Examples, DP

In mathematics, management science, economics, computer science, and bioinformatics, dynamic programming (also known as dynamic optimization) is a method for solving a complex problem by breaking it down into a collection of simpler subproblems, solving each of those subproblems just once, and storing their solutions. (Source: Wikipedia)
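The definition above can be made concrete with a minimal sketch (my own illustration, not from this page): dynamic programming solves each overlapping subproblem once and reuses the stored result, either top-down with memoization or bottom-up with tabulation. The Fibonacci recurrence is the classic toy case.

```python
from functools import lru_cache

# Top-down dynamic programming: naive recursion recomputes the same
# subproblems exponentially often; the cache stores each fib(k) so every
# subproblem is solved exactly once.
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Bottom-up tabulation: the same recurrence, filled in increasing order,
# keeping only the two values the next step needs.
def fib_table(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(50))        # 12586269025 -- instant; plain recursion would take hours
print(fib_table(50))  # same answer via tabulation
```

Both variants run in O(n) time; the difference is only whether the subproblem order is discovered by recursion or fixed in advance.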
Related topics (50 relations): Algorithm design, Anytime algorithm, BLAT, Backpropagation, …
Broader (2): Mathematical optimization, Systems engineering
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited, 2006
Relaxing dynamic programming
B. Lincoln, A. Rantzer
IEEE Transactions on Automatic Control, 2006. Corpus ID: 6005049
The idea of dynamic programming is general and very simple, but the "curse of dimensionality" is often prohibitive and restricts…
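The "curse of dimensionality" mentioned in this snippet refers to the DP value table growing exponentially with the state dimension: discretizing each of d state variables into N levels yields N**d entries. A hypothetical sketch of tabular value iteration (made-up MDP numbers, for illustration only) shows where the table size comes from:

```python
# Tabular value iteration on a tiny, hypothetical 3-state, 2-action MDP.
# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward.
P = {
    0: {0: [(0, 0.9), (1, 0.1)], 1: [(1, 1.0)]},
    1: {0: [(0, 0.5), (2, 0.5)], 1: [(2, 1.0)]},
    2: {0: [(2, 1.0)],           1: [(0, 1.0)]},
}
R = {0: {0: 0.0, 1: 1.0}, 1: {0: 0.5, 1: 2.0}, 2: {0: 0.0, 1: 0.1}}
gamma = 0.9  # discount factor

# Bellman backups: each sweep updates every state's value once.
V = {s: 0.0 for s in P}
for _ in range(200):
    V = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                for a in P[s])
         for s in P}

# The catch: the table needs one entry per state. With a 6-dimensional
# state discretized into 20 levels per dimension, that is already
print(20 ** 6)  # 64000000 entries -- dense enumeration becomes infeasible
```

Work like the Lincoln–Rantzer paper above relaxes the exact Bellman backup to trade optimality for a tractable table.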
Highly Cited, 2005
Assessing the Potential of Predictive Control for Hybrid Vehicle Powertrains Using Stochastic Dynamic Programming
Lars Johannesson Mårdh, Mattias Asbogard, B. Egardt
IEEE Transactions on Intelligent Transportation Systems, 2005. Corpus ID: 6881098
The potential for reduced fuel consumption of hybrid electric vehicles by the use of predictive powertrain control was assessed…
Highly Cited, 2005
Finding optimal Bayesian networks by dynamic programming
A. P. Singh, A. Moore
2005. Corpus ID: 169702
Finding the Bayesian network that maximizes a score function is known as structure learning or structure discovery. Most…
Highly Cited, 2002
Performance analysis of a dynamic programming track before detect algorithm
L. Johnston, V. Krishnamurthy
2002. Corpus ID: 12377830
We analyze a dynamic programming (DP)-based track before detect (TBD) algorithm. By using extreme value theory we obtain explicit…
Highly Cited, 1996
Constrained Discounted Dynamic Programming
E. Feinberg, A. Shwartz
Mathematics of Operations Research, 1996. Corpus ID: 16123220
This paper deals with constrained optimization of Markov Decision Processes with a countable state space, compact action sets…
Highly Cited, 1995
Dynamic Programming for Detecting, Tracking, and Matching Deformable Contours
D. Geiger, Alok Gupta, Luiz A. Costa, J. Vlontzos
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1995. Corpus ID: 5097328
The problem of segmenting an image into separate regions and tracking them over time is one of the most significant problems in…
Highly Cited, 1994
The Solution and Estimation of Discrete Choice Dynamic Programming Models by Simulation and Interpolation
M. Keane, K. Wolpin
1994. Corpus ID: 60839600
Over the past decade, a substantial literature on the estimation of discrete choice dynamic programming (DC-DP) models of…
Highly Cited, 1994
Genetic algorithms compared to other techniques for pipe optimization
A. Simpson, G. Dandy, L. Murphy
1994. Corpus ID: 53978291
The genetic algorithm technique is a relatively new optimization technique. In this paper we present a methodology for optimizing…
Highly Cited, 1977
Dynamic Programming and Stochastic Control
D. Bertsekas, C. White
IEEE Transactions on Systems, Man and Cybernetics, 1977. Corpus ID: 47554622
Highly Cited, 1975
Convergence of discretization procedures in dynamic programming
D. Bertsekas
1975. Corpus ID: 14659729
The computational solution of discrete-time stochastic optimal control problems by dynamic programming requires, in most cases…