Dynamic Programming Conditions for Partially Observable Stochastic Systems

@article{Davis1973DynamicPC,
  title={Dynamic Programming Conditions for Partially Observable Stochastic Systems},
  author={M. Davis and P. Varaiya},
  journal={SIAM Journal on Control},
  year={1973},
  volume={11},
  pages={226--261}
}
In this paper necessary and sufficient conditions for optimality are derived for systems described by stochastic differential equations with control based on partial observations. The solution of the system is defined in a way which permits a very wide class of admissible controls, and then Hamilton–Jacobi criteria for optimality are derived from a version of Bellman's "principle of optimality." The method of solution is based on a result of Girsanov: Wiener measure is transformed for each…
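The abstract refers to Girsanov's measure transformation. As background (this is the standard statement of the theorem, not a formula taken from the paper itself), the transformation replaces Wiener measure $P$ by an equivalent measure $P_u$ via a Radon–Nikodym density, so that the controlled drift is absorbed into the measure:

\[
\frac{dP_u}{dP}
= \exp\!\left( \int_0^T f(t, x, u)\, dW_t
  \;-\; \tfrac{1}{2} \int_0^T \lvert f(t, x, u) \rvert^2\, dt \right),
\]

under which $\tilde W_t = W_t - \int_0^t f(s, x, u)\, ds$ is a Brownian motion with respect to $P_u$. Here $f$ stands for the (assumed bounded) drift induced by the control $u$; the notation is generic, not the paper's own.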
Optimality criteria for controlled discontinuous processes
Optimal control of semi-Markov processes with a backward stochastic differential equations approach
Optimal Control of Point Processes with Noisy Observations: The Maximum Principle
On Feedback Control of Linear Stochastic Systems
