Abstract
Bayesian inference is a key method for estimating parametric uncertainty from data. However, most Bayesian inference methods require either an explicit likelihood function or a large number of samples, both of which are unrealistic to provide for complex first-principles-based models. Here, we propose a novel Bayesian inference methodology for estimating uncertain parameters of computationally intensive first-principles-based models. Our approach exploits both low-complexity surrogate models and variational inference with arbitrarily expressive inference models. The proposed methodology indirectly predicts output responses and casts Bayesian inference as an optimization problem. We demonstrate its performance and applicability on synthetic problems, computational fluid dynamics, and kinetic Monte Carlo simulations. This fast and reliable methodology captures multimodality and the shape of complicated posterior distributions with the quality of state-of-the-art Hamiltonian Monte Carlo methods but at much lower computational cost.
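The abstract describes casting Bayesian inference as an optimization problem over a variational family, with a cheap surrogate standing in for the expensive first-principles simulator. The sketch below is only a minimal illustration of that general idea, not the paper's method: it assumes a hypothetical one-output `surrogate` function, a synthetic observation `y_obs` with known noise `sigma`, and a simple mean-field Gaussian in place of the arbitrarily expressive inference model discussed in the paper, fitted by maximizing a Monte Carlo ELBO estimate in PyTorch.

```python
import torch

torch.manual_seed(0)

def surrogate(theta):
    # Hypothetical low-complexity surrogate; in practice this would be trained
    # on a small number of expensive simulator runs.
    return theta[..., 0] + torch.sin(theta[..., 1])

# Synthetic observation and noise level (assumptions for illustration only)
y_obs, sigma = torch.tensor(1.2), 0.1

# Prior p(theta) and variational posterior q(theta) = N(mu, diag(exp(log_std)^2));
# a mean-field Gaussian is used here instead of an expressive inference model.
prior = torch.distributions.Normal(torch.zeros(2), torch.ones(2))
mu = torch.zeros(2, requires_grad=True)
log_std = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([mu, log_std], lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    q = torch.distributions.Normal(mu, log_std.exp())
    theta = q.rsample((64,))                       # reparameterized samples
    log_lik = torch.distributions.Normal(surrogate(theta), sigma).log_prob(y_obs)
    log_prior = prior.log_prob(theta).sum(-1)
    log_q = q.log_prob(theta).sum(-1)
    elbo = (log_lik + log_prior - log_q).mean()    # Monte Carlo ELBO estimate
    (-elbo).backward()                             # maximize the ELBO
    opt.step()

print("posterior mean:", mu.detach(), "posterior std:", log_std.exp().detach())
```

Because every likelihood evaluation goes through the surrogate rather than the simulator, the optimization loop runs in seconds; capturing multimodal posteriors, as the paper targets, would require replacing the Gaussian family with a more expressive inference model.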
| Original language | English |
| --- | --- |
| Article number | 107322 |
| Journal | Computers and Chemical Engineering |
| Volume | 151 |
| DOIs | |
| State | Published - Aug 2021 |
Bibliographical note
Publisher Copyright: © 2021 Elsevier Ltd
Keywords
- Bayesian inference
- adversarial network
- first-principles simulation
- machine learning
- parameter estimation
- uncertainty