Abstract: When solving optimal impulse control problems, one can use the dynamic programming approach in two different ways: at each time moment, one can decide whether to apply a particular type of impulse, leading to an instantaneous change of the state, or to apply no impulse at all; alternatively, one can plan an impulse after a certain interval, so that the length of that interval is optimized along with the type of impulse. The first method leads to the differential Bellman equation, while the second leads to the integral Bellman equation. The aim of the present article is to prove the equivalence of these Bellman equations. First, we prove that, for the simple deterministic optimal stopping problem, the equations in integral and differential form are equivalent under very mild conditions. Here, the impulse means that the uncontrolled process is stopped, i.e., sent to the so-called cemetery. The obtained result then immediately implies a similar equivalence of the Bellman equations for other models of optimal impulse control, including abstract dynamical systems, controlled ordinary differential equations, piecewise deterministic Markov processes, and continuous-time Markov decision processes.
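For orientation only, here is a minimal illustrative sketch, not taken from the article, of what the two forms can look like for a discounted deterministic optimal stopping problem with cost minimization; the notation (flow \(\phi\) generated by \(\dot x=f(x)\), running cost \(c\), stopping cost \(g\), discount rate \(\alpha>0\), value function \(V\), stopping moment \(\theta\)) is assumed here purely for illustration.

Integral form (the stopping moment \(\theta\) is planned directly, with \(\theta=\infty\) meaning that one never stops):
\[
V(x)\;=\;\inf_{\theta\in[0,\infty]}\Big\{\int_0^{\theta} e^{-\alpha t}\,c\big(\phi(t,x)\big)\,dt \;+\; e^{-\alpha\theta}\,g\big(\phi(\theta,x)\big)\Big\}.
\]
Differential form (at each state one either continues along the flow or stops):
\[
\min\Big\{\alpha V(x)-\nabla V(x)\cdot f(x)-c(x),\;\; g(x)-V(x)\Big\}\;=\;0.
\]
The equivalence discussed in the abstract is the statement that, under mild conditions, the same value function solves both equations.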