A diffusion approximation theorem for a nonlinear PDE with application to random birefringent optical fibers.
Abstract
In this article, we propose a generalization of the theory of diffusion approximation for random ODEs to a nonlinear system of random Schrödinger equations. This system arises in the study of pulse propagation in randomly birefringent optical fibers. We first prove existence and uniqueness of solutions for the random PDE and for the limiting equation. Following the work of Garnier-Marty, where a linear electric field is considered, we then obtain the asymptotic dynamics for the nonlinear electric field.