
Hamilton–Jacobi–Bellman equation

Known as: Bellman, Hamilton-Jacobi-Bellman equation, HJB equation 
The Hamilton–Jacobi–Bellman (HJB) equation is a partial differential equation which is central to optimal control theory. The solution of the HJB… 
(Wikipedia)
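
For orientation, the standard finite-horizon deterministic form of the equation (a textbook statement with assumed dynamics f, running cost ℓ, terminal cost g, and control set A; it is not text extracted from this page) is

\[
\frac{\partial V}{\partial t}(t,x) \;+\; \min_{a \in A} \Big\{ \nabla_x V(t,x) \cdot f(x,a) + \ell(x,a) \Big\} \;=\; 0,
\qquad V(T,x) = g(x),
\]

where V(t, x) is the value function of the underlying optimal control problem.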

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2017
In this paper we study the fully nonlinear stochastic Hamilton-Jacobi-Bellman (HJB) equation for the optimal stochastic control… 
2015
In this paper, we study the risk-averse control problem for diffusion processes. We make use of a forward–backward system of… 
2012
There is an increasing number of applications whose dynamics are better modeled by discontinuous or impulsive trajectories… 
2010
We test a new patch type for the patchy approximate solution to the Hamilton-Jacobi-Bellman equations, and we see an improvement… 
2010
This thesis is concerned with a novel approach to Hamilton-Jacobi-Bellman equations. These partial differential equations arise… 
2009
We begin a study of deterministic continuous-time controllable dynamical systems with a heuristic derivation of the Hamilton… 
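
A standard sketch of such a heuristic derivation (a generic textbook argument, not reproduced from the paper above): the dynamic programming principle over a short horizon Δt states that

\[
V(t,x) \;=\; \min_{a \in A} \Big\{ \ell(x,a)\,\Delta t \;+\; V\big(t+\Delta t,\; x + f(x,a)\,\Delta t\big) \Big\} \;+\; o(\Delta t);
\]

expanding V to first order in Δt, subtracting V(t, x), dividing by Δt, and letting Δt → 0 recovers the HJB equation stated above.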
2009
We show how a minimal deformation of the geometry of the classical Hamilton-Jacobi equation provides a probabilistic theory whose… 
Highly Cited
2008
In order to ensure convergence to the viscosity solution, the standard method for discretizing Hamilton-Jacobi-Bellman partial… 
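
To make the monotonicity requirement concrete, here is a minimal Python sketch of a monotone upwind scheme for a toy 1D exit-time HJB equation; this is an illustrative example under assumed dynamics and costs, not the discretization studied in the paper above.

# Toy 1D exit-time HJB equation: min_a { a * V'(x) } + 1 = 0 (i.e. |V'(x)| = 1)
# on (-1, 1) with V(-1) = V(1) = 0. The viscosity solution is the distance to
# the boundary, V(x) = 1 - |x|.

import numpy as np

def solve_exit_time(n=201, tol=1e-10, max_iter=10_000):
    x = np.linspace(-1.0, 1.0, n)
    h = x[1] - x[0]
    V = np.full(n, 1e6)      # large initial guess in the interior
    V[0] = V[-1] = 0.0       # zero exit cost on the boundary
    for _ in range(max_iter):
        V_old = V.copy()
        # Monotone upwind update: V_i depends on its neighbours in a nondecreasing way.
        for i in range(1, n - 1):
            V[i] = min(V[i - 1], V[i + 1]) + h
        if np.max(np.abs(V - V_old)) < tol:
            break
    return x, V

if __name__ == "__main__":
    x, V = solve_exit_time()
    print("max error vs. viscosity solution:", np.max(np.abs(V - (1.0 - np.abs(x)))))

Monotonicity of the update in the neighbouring values is what rules out convergence to one of the many other almost-everywhere solutions of |V'| = 1; the iteration instead picks out the viscosity solution, which is the behaviour the monotone-scheme requirement in the entry above is meant to guarantee.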
Highly Cited
2008
We consider a family {uϵ(t, x, ω)}, ϵ > 0, of solutions to the equation ∂uϵ/∂t + (ϵ/2)Δuϵ + H(t/ϵ, x/ϵ, ∇uϵ, ω) = 0 with the… 
2007
This article proposes a new capture basin algorithm for computing the numerical solution of a class of Hamilton-Jacobi-Bellman…