65th ISI World Statistics Congress

TIES: AEEI award plenary presentation: Fast Computer Model Calibration using Annealed and Transformed Variational Inference

Conference: 65th ISI World Statistics Congress

Format: SIPS Abstract - WSC 2025

Keywords: inference, variational

Session: SIPS 1109 - TIES President's Invited Speaker

Monday 6 October 9:20 a.m. - 10:30 a.m. (Europe/Amsterdam)

Abstract

This session will provide a short introduction to The International Environmetrics Society (TIES) and to its Abdel El-Shaarawi Early Investigator (AEEI) Award, given by the TIES President (2025-2027), Dr. Nathaniel Newlands. The session will then proceed to the main presentation by the 2025 award winner, Prof. Won Chang.

Presentation: Computer models play a crucial role in numerous scientific and engineering domains. To ensure the accuracy of simulations, it is essential to properly calibrate the input parameters of these models through statistical inference. While Bayesian inference is the standard approach for this task, Markov chain Monte Carlo methods often encounter computational hurdles due to costly likelihood evaluations and slow mixing. Although variational inference (VI) can be a fast alternative to traditional Bayesian approaches, its applicability is limited by boundary issues and local optima. To address these challenges, we propose flexible VI methods based on deep generative models that do not require parametric assumptions on the variational distribution. We embed a surjective transformation in our framework to avoid posterior truncation at the boundary, and we provide theoretical conditions that guarantee the success of the algorithm. Furthermore, our temperature annealing scheme and fine-tuning prevent the optimization from becoming trapped in local optima, through a series of intermediate posteriors and weight adjustment. We apply our method to infectious disease models and a geophysical model, illustrating that the proposed method provides fast and accurate inference compared with competing approaches.
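The temperature annealing idea in the abstract can be illustrated with a minimal sketch: fit a reparameterized variational distribution by stochastic gradient ascent on a tempered ELBO, raising the inverse temperature beta from near 0 to 1 so early steps optimize against a flattened target. Everything here is an assumption for illustration (the toy Gaussian `log_post`, the Gaussian variational family, the linear annealing schedule); the actual method uses deep generative models and calibrates expensive simulators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in "posterior": theta ~ N(3, 1). In real computer model
# calibration, log_post would involve costly simulator evaluations.
def grad_log_post(theta):
    return -(theta - 3.0)

# Variational family (illustrative): q(theta) = N(m, exp(log_s)^2),
# reparameterized as theta = m + exp(log_s) * eps with eps ~ N(0, 1).
# We ascend a tempered ELBO, E_q[beta * log_post(theta)] + H(q),
# with beta annealed from ~0 to 1 (temperature annealing).
m, log_s = -5.0, 0.0
lr, n_mc = 0.05, 64
for t in range(2000):
    beta = min(1.0, (t + 1) / 1000)      # linear annealing schedule (assumed)
    eps = rng.standard_normal(n_mc)
    theta = m + np.exp(log_s) * eps
    g = beta * grad_log_post(theta)      # pathwise (reparameterization) gradient
    m += lr * g.mean()
    # d(theta)/d(log_s) = exp(log_s) * eps; the Gaussian entropy term
    # contributes +1 to the log_s gradient.
    log_s += lr * ((g * np.exp(log_s) * eps).mean() + 1.0)

print(m, np.exp(log_s))                  # should approach the target mean 3 and sd 1
```

Once beta reaches 1, the fixed point of these updates is the exact posterior N(3, 1); with a multimodal target, the flattened intermediate targets are what help the optimizer avoid committing too early to a poor mode.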