Distilling importance sampling for likelihood-free inference
Wednesday 11th January 2023, 14:00
Dennis Prangle, University of Bristol, England
Likelihood-free inference involves inferring parameter values given observed data and a simulator model. The simulator is computer code that takes the parameters, performs stochastic calculations, and outputs simulated data. In this work, we view the simulator as a function whose inputs are (1) the parameters and (2) a vector of pseudo-random draws, and attempt to infer all of these inputs. This is challenging, as the resulting posterior can be high dimensional and involve strong dependence. We approximate the posterior using normalising flows, a flexible parametric family of densities. Training data is generated by ABC importance sampling with a large bandwidth parameter. This is "distilled" by using it to train the normalising flow parameters. The process is iterated, using the updated flow as the importance sampling proposal and slowly reducing the ABC bandwidth, until the resulting proposal is a good approximation to the posterior. Unlike most other likelihood-free methods, we avoid the need to reduce data to low-dimensional summary statistics, and hence can achieve more accurate results.
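The iterative scheme described in the abstract can be illustrated with a minimal sketch. The toy simulator, Gaussian prior, geometric bandwidth schedule, and the use of a simple Gaussian proposal refit by moment matching are all assumptions for illustration; the talk's actual method trains a normalising flow over both the parameters and the pseudo-random draws.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (an assumption for illustration): data = parameter + N(0,1) noise.
def simulator(theta, noise):
    return theta + noise

y_obs = 1.5            # observed data
eps = 4.0              # initial (large) ABC bandwidth
mu, sigma = 0.0, 3.0   # Gaussian proposal: stand-in for the normalising flow
n = 2000               # importance samples per iteration

for step in range(10):
    # 1. ABC importance sampling: draw parameters from the current proposal
    #    and pseudo-random inputs for the simulator.
    theta = rng.normal(mu, sigma, n)
    noise = rng.normal(0.0, 1.0, n)
    y_sim = simulator(theta, noise)

    # 2. Weights: Gaussian ABC kernel comparing simulated and observed data,
    #    times the N(0,1) prior, divided by the proposal density.
    #    (The density of the pseudo-random draws cancels in this toy setup.)
    log_w = (
        -0.5 * (y_sim - y_obs) ** 2 / eps**2                   # ABC kernel
        - 0.5 * theta**2                                       # prior
        + 0.5 * ((theta - mu) / sigma) ** 2 + np.log(sigma)    # / proposal
    )
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # 3. "Distil": refit the proposal to the weighted sample. Moment matching
    #    here; the actual method trains flow parameters on the weighted draws.
    mu = float(np.sum(w * theta))
    sigma = float(np.sqrt(np.sum(w * (theta - mu) ** 2)))

    # 4. Slowly reduce the ABC bandwidth before the next iteration.
    eps *= 0.7

# For this conjugate toy model the exact posterior is N(0.75, sqrt(0.5)),
# so the fitted proposal should end up close to mean 0.75, sd 0.71.
print(mu, sigma)
```

Each pass "distils" the expensive importance sample into a cheap parametric proposal, so later iterations can afford a smaller bandwidth without the weights degenerating.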
