Young Researcher Workshop on Uncertainty Quantification and Machine Learning

Adaptive Gaussian mixture model based on implicit sampling for Bayesian inverse problems

Speaker

Lijian Jiang , Tongji University

Time

05 Jun, 10:00 - 10:30

Abstract

In this talk, I present an adaptive Gaussian mixture model (AGMM) based on the implicit sampling method for Bayesian inverse problems. AGMM and implicit sampling are integrated to obtain a Gaussian mixture model (GMM) approximation of the posterior together with importance ensemble samples. The number of Gaussian components is unknown and is selected by the smoothed expectation-maximization (SmEM) algorithm. The proposed method consists of two steps: a forecast step and an analysis step. In the forecast step, clustering by SmEM provides forecasts of the means, covariances, and weights of the GMM. The ensemble samples supplied to SmEM are obtained by implicit sampling, which generates samples in the high-probability region. Constructing the implicit map requires the means and covariances of the previous GMM. The importance samples are then generated by the implicit map, and the corresponding sample weights are the ratios between the posterior density and the importance density. In the analysis step, the forecast GMM is updated by a modified Levenberg-Marquardt (LM) method. The proposed AGMM based on implicit sampling (AGMM-IS) finds an optimal solution of the likelihood function while avoiding explicit computation of the gradient and Hessian matrices, which may be computationally expensive in high-dimensional spaces. The AGMM-IS method is applied to non-Gaussian models to explore the posterior of the unknowns in inverse problems.
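The importance-weighting step described above can be sketched in a few lines. The following is a minimal, hypothetical one-dimensional illustration (not the speaker's implementation): samples are drawn via a Gaussian proposal whose mean and covariance stand in for those of a previous GMM component, mapped from reference Gaussian variables as in implicit sampling, and then weighted by the ratio of an assumed toy unnormalized posterior density to the proposal density.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(x):
    # Toy bimodal (non-Gaussian) unnormalized log-posterior, chosen only
    # for illustration; the real posterior comes from the inverse problem.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

# Gaussian proposal centred on one posterior mode; in the method, the mean
# and covariance would come from the previous GMM component.
mu, sigma = 2.0, 1.0
xi = rng.standard_normal(1000)           # reference Gaussian samples
x = mu + sigma * xi                      # implicit map: reference -> proposal

log_q = -0.5 * xi ** 2 - np.log(sigma)   # proposal log-density (up to a constant)
log_w = log_posterior(x) - log_q         # log importance weights: posterior / proposal
w = np.exp(log_w - log_w.max())          # stabilise before exponentiating
w /= w.sum()                             # self-normalised importance weights
```

The weights are computed in log space and self-normalized, which avoids underflow and removes the unknown normalizing constant of the posterior.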