Martingale methods in stochastic control
Mark H. A. Davis
Abstract: The martingale treatment of stochastic control problems is based on the idea that the correct formulation of Bellman's principle of optimality for stochastic minimization problems is in terms of a submartingale inequality: the value function of dynamic programming is always a submartingale, and it is a martingale under a particular control strategy if and only if that strategy is optimal. Local conditions for optimality, in the form of a minimum principle, can be obtained by applying Meyer…
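The submartingale formulation described in the abstract can be sketched as follows; the notation (value function $V$, running cost $c$, controlled state $X^u$) is ours, not taken from the paper:

```latex
% Sketch of the martingale optimality principle (our notation).
% Let V(t,x) be the dynamic-programming value function, c the running
% cost, and X^u the state process under an admissible control u. Set
\[
  M_t^u \;=\; \int_0^t c(s, X_s^u, u_s)\,\mathrm{d}s \;+\; V(t, X_t^u).
\]
% Bellman's principle then says that for every admissible u the process
% (M_t^u) is a submartingale:
\[
  \mathbb{E}\bigl[\,M_t^u \,\big|\, \mathcal{F}_s\,\bigr] \;\ge\; M_s^u,
  \qquad 0 \le s \le t,
\]
% and a control u^* is optimal if and only if (M_t^{u^*}) is a
% martingale, i.e. the inequality above holds with equality.
```

Intuitively, deviating from the optimal strategy can only increase the expected remaining cost, which is exactly the submartingale drift; optimality corresponds to zero drift.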
Martingale approach to control for general jump processes
Optimal Control of Markov Processes.
Martingale approach to stochastic differential games of control and stopping.
Stochastic Near-Optimal Controls for Path-Dependent Systems
Two Approaches to Non-Zero-Sum Stochastic Differential Games of Control and Stopping
Weak Functional Itô Calculus and Applications