In this talk, we provide a sensitivity analysis for two types of problems: robust optimization problems and nonlinear Kolmogorov partial differential equations (PDEs). Our first goal is to quantify the sensitivity of a given robust optimization problem to model uncertainty, where the size of the uncertainty is measured by a small parameter ε. We achieve this by showing that, as ε approaches 0, the robust problem can be approximated by the baseline problem, computed using the baseline processes. Our second goal is to quantify the sensitivity of nonlinear Kolmogorov PDEs to small nonlinearities in the Hamiltonian of the drift and diffusion coefficients, and to use the resulting analysis to develop an efficient numerical method for their approximation. We demonstrate that, as ε approaches 0, the nonlinear Kolmogorov PDE can be approximated by linear Kolmogorov PDEs involving the baseline coefficients. The derived sensitivity analysis then yields a Monte Carlo-based numerical method that efficiently solves these nonlinear Kolmogorov PDEs. This talk is based on joint work with Daniel Bartl (Univ. Vienna) and Ariel Neufeld (NTU Singapore).
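To illustrate the building block underlying such Monte Carlo methods, the sketch below solves a *linear* Kolmogorov PDE with baseline coefficients via the Feynman–Kac representation: simulate the baseline SDE with Euler–Maruyama and average the terminal payoff. This is only a hedged toy example with hypothetical coefficient choices, not the speakers' actual scheme, which additionally uses the sensitivity analysis to correct for the small nonlinearity.

```python
import numpy as np

# Toy illustration (not the speakers' method): Feynman-Kac Monte Carlo for a
# one-dimensional linear Kolmogorov PDE with baseline coefficients b0, sigma0:
#   du/dt + b0(x) du/dx + 0.5 * sigma0(x)^2 d2u/dx2 = 0,  u(T, x) = g(x),
# whose solution is u(0, x) = E[g(X_T)] where dX = b0(X) dt + sigma0(X) dW,
# X_0 = x. All function names and parameter values are hypothetical choices.

def mc_linear_kolmogorov(x0, T, b0, sigma0, g,
                         n_paths=100_000, n_steps=100, seed=0):
    """Monte Carlo estimate of u(0, x0) via Euler-Maruyama paths."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x += b0(x) * dt + sigma0(x) * dw  # Euler-Maruyama update
    return g(x).mean()  # empirical mean approximates E[g(X_T)]

# Sanity check: with b0 = 0, sigma0 = 1 and g(x) = x^2, the exact value is
# u(0, 0) = E[W_T^2] = T.
u_hat = mc_linear_kolmogorov(0.0, 1.0,
                             b0=lambda x: 0.0 * x,
                             sigma0=lambda x: 1.0 + 0.0 * x,
                             g=lambda x: x**2)
```

In the setting of the talk, the nonlinear PDE is then approximated, for small ε, by evaluating such linear solvers at the baseline coefficients.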