| Date | Aug 11, 2017 |
|---|---|
| Speaker | Viet Ha Hoang |
| Dept. | NTU, Singapore |
| Room | 129-310 |
| Time | 16:00-17:00 |

Solving partial differential equations with stochastic coefficients is exceedingly complicated. The Monte Carlo method, which computes the expectation of the solution, requires an enormous amount of computational resources, often exceeding the capacity of currently available computers. In this talk, we review recent research on the (quasi-)best N-term approximation of the generalized polynomial chaos expansion of the solution, in which the terms are approximated adaptively according to the magnitude of their norms. The method achieves a prescribed level of accuracy with optimal complexity. We consider both the case where the coefficient is uniformly bounded, and the far more complicated case where the coefficient is of log-normal form (i.e. its logarithm follows a normal distribution) and can get arbitrarily close to zero and arbitrarily large.
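To see why plain Monte Carlo is so costly, consider a minimal 1D sketch: each sample of the coefficient requires a full PDE solve, and the sampling error decays only like O(1/sqrt(N)). All names, grid sizes, and the i.i.d. per-cell log-normal coefficient below are illustrative assumptions, not the (far more general) setting of the talk:

```python
import numpy as np

def solve_1d_diffusion(a, f=1.0):
    """Solve -(a u')' = f on (0,1), u(0)=u(1)=0, by finite differences.
    `a` holds the coefficient on the n cells of a uniform grid."""
    n = len(a)
    h = 1.0 / n
    # Tridiagonal stiffness matrix for a cell-wise constant coefficient
    main = (a[:-1] + a[1:]) / h**2        # diagonal, interior nodes 1..n-1
    off = -a[1:-1] / h**2                 # off-diagonals
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    rhs = np.full(n - 1, f)
    return np.linalg.solve(A, rhs)        # u at the interior grid nodes

def mc_mean(n_samples=2000, n_cells=64, sigma=1.0, seed=0):
    """Plain Monte Carlo estimate of E[u] with a crude log-normal
    coefficient a(x, omega) = exp(sigma * Y), Y ~ N(0,1) i.i.d. per cell
    (an assumption; realistic models use a correlated Gaussian field).
    Every sample costs one full PDE solve."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(n_cells - 1)
    for _ in range(n_samples):
        Y = rng.standard_normal(n_cells)
        acc += solve_1d_diffusion(np.exp(sigma * Y))
    return acc / n_samples
```

For a constant coefficient a = 1 the solver reproduces u(x) = x(1-x)/2; the Monte Carlo mean then repeats such solves thousands of times, which is exactly the cost that the best N-term gPC approximation is designed to avoid.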

If time permits, we discuss our recent work on Bayesian inverse problems, in which the coefficient of a (forward) partial differential equation is reconstructed from limited available information on its solution. We find the posterior probability measure of the coefficient, i.e. the conditional probability given the noisy observations, with respect to a prior probability measure. This posterior is sampled by the Markov chain Monte Carlo (MCMC) method. Plain MCMC solves the forward equation for a large number of realizations and is prohibitively expensive. We introduce two methods that accelerate this process drastically. We show that the generalized polynomial chaos MCMC method, which employs the best N-term approximation above, performs the task with optimal complexity. We then review the multilevel MCMC method (first introduced in V. H. Hoang, Ch. Schwab and A. M. Stuart (2013), Complexity analysis of accelerated MCMC methods for Bayesian inversion, Inverse Problems, 29, 085010, 37 pp., for Bayesian inverse problems with bounded coefficients, and recently extended to Gaussian priors for unbounded coefficients), which also samples the posterior measure with optimal complexity.
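The plain MCMC loop can be sketched for a toy scalar coefficient with a random-walk Metropolis proposal. The forward map, standard normal prior, noise level, and step size below are illustrative assumptions only; in the talk, each evaluation of the forward map is itself an expensive PDE solve, which is what the accelerated methods avoid:

```python
import numpy as np

def mcmc_posterior_mean(y_obs, forward, noise_sd=0.1, n_steps=5000,
                        step=0.5, seed=0):
    """Random-walk Metropolis for a scalar coefficient theta with a
    standard normal prior. The posterior density is proportional to
    exp(-(y_obs - forward(theta))^2 / (2 noise_sd^2)) * N(theta; 0, 1).
    (Illustrative sketch; names and defaults are assumptions.)"""
    rng = np.random.default_rng(seed)

    def log_post(t):
        misfit = (y_obs - forward(t))**2 / (2 * noise_sd**2)
        return -misfit - t**2 / 2         # log-likelihood + log-prior

    theta, lp = 0.0, log_post(0.0)
    total = 0.0
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop)          # one forward solve per step
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept
            theta, lp = prop, lp_prop
        total += theta
    return total / n_steps
```

Note that every step calls `forward` once; when that call is a PDE solve on a fine mesh, the total cost of plain MCMC becomes prohibitive, motivating the gPC surrogate and multilevel variants discussed in the talk.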

This is joint work with Jia Hao Quek (NTU, Singapore), Christoph Schwab (ETH Zurich, Switzerland), and Andrew Stuart (Caltech, USA).


TEL 02-880-5857,6530,6531 / FAX 02-887-4694