# Controlled Markov processes and viscosity solutions

@book{Fleming1992ControlledMP, title={Controlled Markov processes and viscosity solutions}, author={Wendell H. Fleming and Halil Mete Soner}, year={1992} }

This book is intended as an introduction to optimal stochastic control for continuous-time Markov processes and to the theory of viscosity solutions. The authors approach stochastic control problems by the method of dynamic programming. The text provides an introduction to dynamic programming for deterministic optimal control problems, as well as to the corresponding theory of viscosity solutions. Also covered are controlled Markov diffusions and viscosity solutions of Hamilton-Jacobi-Bellman equations.
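For orientation, the dynamic programming approach the book develops leads to a Hamilton-Jacobi-Bellman (HJB) equation for the value function. The following is a standard sketch in my own notation (the symbols $b$, $\sigma$, $f$, $g$ and the maximization convention are illustrative, not quoted from the book):

```latex
% Controlled diffusion:  dX_s = b(X_s, u_s)\,ds + \sigma(X_s, u_s)\,dW_s
% Value function (finite horizon, reward maximization):
%   V(t,x) = \sup_{u(\cdot)} \mathbb{E}\big[ \int_t^T f(X_s,u_s)\,ds + g(X_T) \,\big|\, X_t = x \big]
% Dynamic programming formally yields the HJB equation:
\[
  \partial_t V(t,x)
  + \sup_{u \in U}\Big\{ b(x,u)\cdot \nabla_x V(t,x)
  + \tfrac{1}{2}\operatorname{tr}\!\big(\sigma\sigma^{\top}(x,u)\,D_x^2 V(t,x)\big)
  + f(x,u)\Big\} = 0,
  \qquad V(T,x) = g(x).
\]
% In general this must be interpreted in the viscosity sense, since V
% need not be twice differentiable.
```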

#### 3,704 Citations

Controlled markov processes, viscosity solutions and applications to mathematical finance

- Mathematics
- 1997

The purpose of this section is to give a concise, nontechnical introduction to stochastic differential equations and to controlled diffusion processes. In the first subsection, we provide a brief…

Solution of stochastic optimal control problems and financial applications

- Mathematics
- 2017

In this paper, stochastic optimal control problems, which frequently occur in economics and finance, are investigated. First, using Bellman’s dynamic programming method, the stochastic optimal…

Tutorial for Viscosity Solutions in Optimal Control of Diffusions

- Mathematics
- 2010

This tutorial is an introduction to the theory of viscosity solutions of Hamilton-Jacobi-Bellman equations/inequalities in the realm of stochastic control problems. It is an easy-to-use reference for…

Optimal Control of Nonlinear Systems with Controlled Transitions

- 2005

This paper studies the optimum stochastic control problem with piecewise deterministic dynamics. The controls enter through the system dynamics as well as the transitions for the underlying Markov…

Viscosity Solutions for a System of Integro-PDEs and Connections to Optimal Switching and Control of Jump-Diffusion Processes

- Mathematics
- 2010

We develop a viscosity solution theory for a system of nonlinear degenerate parabolic integro-partial differential equations (IPDEs) related to stochastic optimal switching and control problems or…

Optimal control of piecewise deterministic nonlinear systems with controlled transitions: viscosity solutions, their existence and uniqueness

- Mathematics
- Proceedings of the 38th IEEE Conference on Decision and Control (Cat. No.99CH36304)
- 1999

The paper studies viscosity solutions of two sets of Hamilton-Jacobi-Bellman (HJB) equations (one for the finite horizon and the other for the infinite horizon) which arise in the optimal control of…

Optimal control of ultradiffusion processes with application to mathematical finance

- Mathematics, Computer Science
- Int. J. Comput. Math.
- 2015

A method-of-lines finite element method is utilized to approximate the value function of a European style call option in a market subject to asset liquidity risk (including limit orders) and brokerage fees.

Capacities, Measurable Selection and Dynamic Programming Part II: Application in Stochastic Control Problems

- Mathematics
- 2013

We aim to give an overview on how to derive the dynamic programming principle for a general stochastic control/stopping problem, using measurable selection techniques. By considering their martingale…
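For context, the dynamic programming principle in question can be stated, in standard form and with notation that is mine rather than the paper's, as:

```latex
% For the value function V(t,x) of a finite-horizon control problem and
% any stopping time \theta with t \le \theta \le T:
\[
  V(t,x) = \sup_{u(\cdot)} \mathbb{E}\Big[
    \int_t^{\theta} f(X_s, u_s)\,ds + V(\theta, X_\theta)
    \,\Big|\, X_t = x \Big].
\]
% Measurable selection enters when justifying the inequality "\ge":
% one must paste together \varepsilon-optimal controls measurably with
% respect to the state at time \theta.
```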

Optimal Control Problems for Stochastic Reaction-Diffusion Systems with Non-Lipschitz Coefficients

- Mathematics, Computer Science
- SIAM J. Control. Optim.
- 2001

By using the dynamic programming approach, a control problem for a class of stochastic reaction-diffusion systems with coefficients having polynomial growth is studied, and a non-Lipschitz term appears in the cost functional, which allows the quadratic case to be treated.

Stochastic control via direct comparison

- Mathematics, Computer Science
- Discret. Event Dyn. Syst.
- 2011

This paper introduces an alternative approach based on direct comparison of the performance of any two policies, by modeling the state process as a continuous-time and continuous-state Markov process and applying the same ideas as for the discrete-time and discrete-state case.

#### References

SHOWING 1-10 OF 116 REFERENCES

Additive control of stochastic linear systems with finite horizon

- Mathematics
- 1985

We consider a dynamic system whose state is governed by a linear stochastic differential equation with time-dependent coefficients. The control acts additively on the state of the system. Our…

Exit probabilities and optimal stochastic control

- Mathematics
- 1977

This paper is concerned with Markov diffusion processes which obey stochastic differential equations depending on a small parameter ε. The parameter enters as a coefficient in the noise term of the…

Piecewise-Deterministic Markov Processes: A General Class of Non-Diffusion Stochastic Models

- Mathematics
- 1984

A general class of non-diffusion stochastic models is introduced with a view to providing a framework for studying optimization problems arising in queueing systems, inventory theory, resource…

Stochastic Control for Small Noise Intensities

- Mathematics
- 1971

This paper is concerned with the approximate solution of stochastic optimal control problems which arise by perturbing the system equations in the deterministic Pontryagin control model, through an…

Stochastic Control and Exit Probabilities of Jump Processes

- Mathematics
- 1985

A stochastic control problem is formulated for some problems related to Markov processes. This formulation is in some sense a generalization of the one used in [2], [3], [4], [8] for the diffusion case. We…

Optimal control of diffusion processes and Hamilton–Jacobi–Bellman equations part 2: viscosity solutions and uniqueness

- Mathematics
- 1983

We consider general optimal stochastic control problems and the associated Hamilton–Jacobi–Bellman equations. We develop a general notion of weak solutions – called viscosity solutions – of the…
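For reference, the standard definition of viscosity solutions (in the later Crandall-Ishii-Lions formulation, paraphrased rather than quoted from this paper) reads:

```latex
% For a degenerate elliptic equation F(x, v, Dv, D^2 v) = 0:
% - v is a viscosity subsolution if, for every smooth test function
%   \varphi and every local maximum point x_0 of v - \varphi,
\[
  F\big(x_0, v(x_0), D\varphi(x_0), D^2\varphi(x_0)\big) \le 0;
\]
% - v is a viscosity supersolution if the reversed inequality holds at
%   local minimum points of v - \varphi;
% - v is a viscosity solution if it is both.
```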

Singular perturbations in manufacturing

- Mathematics
- 1993

An asymptotic analysis for a large class of stochastic optimization problems arising in manufacturing is presented. A typical example of the problems considered in this paper is a production planning…

Regularity of the value function for a two-dimensional singular stochastic control problem

- Mathematics
- 1989

It is desired to control a two-dimensional Brownian motion by adding a (possibly singularly) continuous process to it so as to minimize an expected infinite-horizon discounted running cost. The…

A class of singular stochastic control problems

- Mathematics
- 1983

We consider the problem of tracking a Brownian motion by a process of bounded variation, in such a way as to minimize total expected cost of both 'action' and 'deviation from a target state 0'. The…

Some characterizations of optimal trajectories in control theory

- Mathematics
- 1991

Several characterizations of optimal trajectories for the classical Mayer problem in optimal control are provided. For this purpose the regularity of directional derivatives of the value function is…