Abstract
In this article we propose a generalization of the theory of diffusion approximation for random ODEs to a nonlinear system of random Schrödinger equations. This system arises in the study of pulse propagation in randomly birefringent optical fibers. We first show existence and uniqueness of solutions for the random PDE and for the limiting equation. We follow the work of Garnier and Marty [Wave Motion 43 (2006) 544–560] and Marty [Problèmes d'évolution en milieux aléatoires: Théorèmes limites, schémas numériques et applications en optique (2005) Univ. Paul Sabatier], where a linear electric field is considered, and we obtain the asymptotic dynamics of the nonlinear electric field.
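For orientation, the classical diffusion-approximation setting for random ODEs that the article generalizes can be sketched as follows; this is only an illustrative LaTeX sketch with generic symbols (X^\varepsilon, F, \nu, x_0) that are not taken from the paper.

% Illustrative sketch of the classical diffusion-approximation scaling for
% random ODEs (generic symbols, not the paper's notation): \nu is a stationary,
% rapidly mixing driving process and F is centered in its second argument;
% under such mixing and centering conditions, X^\varepsilon converges in law
% to a diffusion process as \varepsilon \to 0.
\begin{equation*}
  \frac{dX^{\varepsilon}}{dt}(t)
    = \frac{1}{\varepsilon}\,
      F\!\left(X^{\varepsilon}(t),\, \nu\!\left(\frac{t}{\varepsilon^{2}}\right)\right),
  \qquad X^{\varepsilon}(0) = x_{0}.
\end{equation*}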
Citation
A. de Bouard, M. Gazeau. "A diffusion approximation theorem for a nonlinear PDE with application to random birefringent optical fibers." Ann. Appl. Probab. 22 (6), 2460–2504, December 2012. https://doi.org/10.1214/11-AAP839