Abstract
We consider the conditional control problem introduced by Lions in his lectures at the Collège de France in November 2016. In his lectures, Lions emphasized some of the major differences with the analysis of classical stochastic optimal control problems, and in so doing, raised the question of the possible differences between the value functions resulting from optimization over the class of Markovian controls as opposed to the general family of open-loop controls. The goal of the paper is to elucidate this quandary and settle Lions’ original conjecture in the case of a relaxed version of the problem. First, we justify the mathematical formulation of the conditional control problem by the description of a practical model from evolutionary biology. Next, we relax the original formulation by the introduction of soft as opposed to hard killing, and using a mimicking argument, we reduce the open-loop optimization problem to an optimization over a specific class of feedback controls. After proving existence of optimal feedback control functions, we prove a superposition principle allowing us to recast the original stochastic control problems as deterministic control problems for dynamical systems of probability Gibbs measures. Next, we characterize the solutions by forward–backward systems of coupled nonlinear and nonlocal partial differential equations (PDEs), very much in the spirit of some of the mean field game (MFG) systems. From there, we identify a common optimizer, proving the conjecture of equality of the value functions. Finally, we illustrate the results with convincing numerical experiments.
Citation
René Carmona, Mathieu Laurière, Pierre-Louis Lions. "Nonstandard stochastic control with nonlinear Feynman–Kac costs." Illinois J. Math. 68 (3), 577–637, September 2024. https://doi.org/10.1215/00192082-11416739