Markov decision process
Known as: Value iteration, Policy iteration, Markov decision problems
Markov decision processes (MDPs) provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker.
Source: Wikipedia
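As a quick illustration of the value-iteration method listed under "Known as" above, the sketch below solves a tiny finite MDP with NumPy. The two states, two actions, transition matrix P, rewards R, and discount factor gamma are invented for this example and are not taken from any paper listed on this page.

```python
import numpy as np

# Minimal value-iteration sketch for a toy finite MDP (all numbers are illustrative).
P = np.array([
    [[0.8, 0.2], [0.1, 0.9]],   # P[0][s][s']: transition probabilities under action 0
    [[0.5, 0.5], [0.6, 0.4]],   # P[1][s][s']: transition probabilities under action 1
])
R = np.array([
    [1.0, 0.0],                 # R[0][s]: expected reward for action 0 in state s
    [0.5, 2.0],                 # R[1][s]: expected reward for action 1 in state s
])
gamma = 0.95                    # discount factor

V = np.zeros(2)
for _ in range(10_000):
    # Bellman optimality backup: Q(s,a) = R(s,a) + gamma * sum_s' P(s'|s,a) V(s')
    Q = R + gamma * (P @ V)     # shape (actions, states)
    V_new = Q.max(axis=0)       # V(s) = max_a Q(s,a)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)       # greedy policy extracted from the converged values
print("values:", V, "policy:", policy)
```

Policy iteration would instead alternate full policy evaluation with greedy policy improvement; for a discounted finite MDP both methods converge to the same optimal value function.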
Related topics
49 relations, including: Apprenticeship learning, Artificial neural network, Automatic control, Backward induction
Broader (2): Dynamic programming, Stochastic control
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited, 2011
Multigrid Methods for Hamilton-Jacobi-Bellman, Hamilton-Jacobi Equations
D. Han
Corpus ID: 2233819
We propose multigrid methods for solving Hamilton-Jacobi-Bellman (HJB) and Hamilton-Jacobi-Bellman-Isaacs (HJBI) equations. The…
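For orientation, a standard stationary form of an HJB equation for a controlled diffusion is shown below. This is a generic textbook form with discount rate \rho, control set A, drift b, diffusion \sigma, and running cost f, not the specific formulation used in the paper above.

```latex
% Stationary discounted HJB equation (generic form, for orientation only):
\rho\, u(x) = \min_{a \in A} \Big\{ f(x,a) + b(x,a)\cdot \nabla u(x)
  + \tfrac{1}{2}\,\mathrm{tr}\!\big(\sigma(x,a)\,\sigma(x,a)^{\top}\, \nabla^{2} u(x)\big) \Big\}
```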
Highly Cited, 2008
A quality of service negotiation-based vertical handoff decision scheme in heterogeneous wireless systems
Qingyang Song, A. Jamalipour
European Journal of Operational Research, Corpus ID: 2976978
Highly Cited, 2007
On the Access Pricing and Network Scaling Issues of Wireless Mesh Networks
R. K. Lam, D. Chiu, John C.S. Lui
IEEE Transactions on Computers, Corpus ID: 16792512
Distributed wireless mesh network technology is ready for public deployment in the near future. However, without an incentive…
2006
Existence of Optimal Policies for Semi-Markov Decision Processes Using Duality for Infinite Linear Programming
D. Klabjan, Daniel Adelman
SIAM Journal on Control and Optimization, Corpus ID: 11806830
Semi-Markov decision processes on Borel spaces with deterministic kernels have many practical applications, particularly in…
Highly Cited, 1998
Structured Reachability Analysis for Markov Decision Processes
Craig Boutilier, R. Brafman, C. Geib
Conference on Uncertainty in Artificial Intelligence, Corpus ID: 1055911
Recent research in decision theoretic planning has focussed on making the solution of Markov decision processes (MDPs) more…
1992
A Weighted Markov Decision Process
D. Krass, J. Filar, S. Sinha
Operations Research, Corpus ID: 6196399
The two most commonly considered reward criteria for Markov decision processes are the discounted reward and the long-term…
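For context on the snippet above, the two criteria it names are usually written as follows. These are standard textbook forms (policy \pi, discount factor \beta, reward r), not formulas taken from the paper itself.

```latex
% Discounted reward criterion (0 < \beta < 1):
V_{\beta}^{\pi}(s) = \mathbb{E}_{s}^{\pi}\!\left[ \sum_{t=0}^{\infty} \beta^{t}\, r(s_t, a_t) \right]
% Long-run average reward criterion:
\Phi^{\pi}(s) = \liminf_{N \to \infty} \frac{1}{N}\, \mathbb{E}_{s}^{\pi}\!\left[ \sum_{t=0}^{N-1} r(s_t, a_t) \right]
```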
Highly Cited, 1988
Hierarchic Markov processes and their applications in replacement models
A. Kristensen
Corpus ID: 60853682
Review, 1978
Contracting Markov decision processes
V. Nunen
Corpus ID: 126118263
Review, 1978
Contracting Markov Decision Processes
A. Unwin
Corpus ID: 62472393
1977
Markov decision processes with unbounded rewards
J. Wessels, V. Nunen
Corpus ID: 118917333
Markov decision processes which allow for an unbounded reward structure are considered. Conditions are given which allow…