
Scalable inference with autoregressive neural ratio estimation

Bibliographic Details
Published in: Monthly Notices of the Royal Astronomical Society, 2024-05, Vol. 530 (4), p. 4107-4124
Main Authors: Anau Montel, Noemi, Alvey, James, Weniger, Christoph
Format: Article
Language: English
Description
Summary: In recent years, there has been a remarkable development of simulation-based inference (SBI) algorithms, which have now been applied across a wide range of astrophysical and cosmological analyses. These methods offer a number of key advantages, centred on the ability to perform scalable statistical inference without an explicit likelihood. In this work, we propose two technical building blocks for a specific sequential SBI algorithm, truncated marginal neural ratio estimation (TMNRE). First, we develop autoregressive ratio estimation with the aim of robustly estimating correlated high-dimensional posteriors. Secondly, we propose a slice-based nested sampling algorithm to efficiently draw both posterior samples and constrained prior samples from ratio estimators, the latter being instrumental for sequential inference. To validate our implementation, we carry out inference tasks on three concrete examples: a toy model of a multidimensional Gaussian, the analysis of a stellar stream mock observation, and finally, a proof-of-concept application to substructure searches in strong gravitational lensing. In addition, we publicly release the code for both the autoregressive ratio estimator and the slice sampler.
ISSN: 0035-8711, 1365-2966
DOI: 10.1093/mnras/stae1130
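
To make the autoregressive ratio estimation idea in the abstract concrete, the sketch below factorises the posterior-to-prior ratio into one-dimensional conditional ratios, r(theta | x) = prod_i r(theta_i | x, theta_{<i}), each estimated by a small binary classifier trained on jointly simulated versus shuffled parameter samples. This is a minimal PyTorch sketch under the assumption of a factorised prior; it is not the authors' released code, and the class names (ConditionalRatio, AutoregressiveRatio), network sizes, and training loss below are illustrative choices only.

import torch
import torch.nn as nn

class ConditionalRatio(nn.Module):
    # Logit of the one-dimensional conditional ratio
    # r(theta_i | x, theta_{<i}) = p(theta_i | x, theta_{<i}) / p(theta_i).
    def __init__(self, n_features, i):
        super().__init__()
        self.i = i
        self.net = nn.Sequential(
            nn.Linear(n_features + i + 1, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, features, theta):
        # Condition on the data summary and the preceding parameters theta_{<i}.
        inp = torch.cat([features, theta[:, : self.i + 1]], dim=-1)
        return self.net(inp).squeeze(-1)

class AutoregressiveRatio(nn.Module):
    # The joint log-ratio log r(theta | x) is the sum of the conditional
    # log-ratios, one head per parameter (assuming a factorised prior).
    def __init__(self, n_features, n_params):
        super().__init__()
        self.heads = nn.ModuleList(
            ConditionalRatio(n_features, i) for i in range(n_params)
        )

    def log_ratio(self, features, theta):
        return sum(head(features, theta) for head in self.heads)

def contrastive_loss(model, features, theta):
    # For each head i, positives are jointly simulated (x, theta); negatives
    # shuffle theta_i across the batch (a draw from its marginal) while
    # keeping x and theta_{<i} matched, so the optimal classifier logit
    # equals the conditional log-ratio.
    bce = nn.functional.binary_cross_entropy_with_logits
    total = 0.0
    for i, head in enumerate(model.heads):
        logits_joint = head(features, theta)
        theta_shuffled = theta.clone()
        theta_shuffled[:, i] = theta[torch.randperm(theta.shape[0]), i]
        logits_marginal = head(features, theta_shuffled)
        total = total + bce(logits_joint, torch.ones_like(logits_joint)) \
                      + bce(logits_marginal, torch.zeros_like(logits_marginal))
    return total

In the TMNRE setting described in the abstract, a trained ratio of this kind would additionally drive the sequential truncation of the prior; the paper's slice-based nested sampler, which draws the posterior and constrained prior samples, is not sketched here.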