
Effortless bias reduction in self-normalised importance sampling

Seminar
Organization (or team, for internal seminars)
CMAP, Ecole polytechnique
Speaker
Gabriel Cardoso
Abstract

Importance Sampling (IS) is a method for approximating expectations under a target distribution using independent samples from a proposal distribution and the associated importance weights. 

In many applications, the target distribution is known only up to a normalization constant, in which case self-normalized IS (SNIS) can be used. 

This is notably the case for posterior sampling in most Bayesian models.
While self-normalization can reduce the dispersion of the estimator, it introduces bias.
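To make the construction concrete, here is a minimal sketch of the SNIS estimator, assuming (purely for illustration) a standard-normal unnormalized target and a Gaussian proposal. SNIS replaces the unknown normalizing constant by the empirical sum of the importance weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalised target: p(x) ∝ exp(-x^2/2) (standard normal, constant dropped)
def log_p_unnorm(x):
    return -0.5 * x ** 2

# Illustrative proposal: N(1, 2^2), easy to sample from
mu_q, sigma_q = 1.0, 2.0

def log_q(x):
    return -0.5 * ((x - mu_q) / sigma_q) ** 2 - np.log(sigma_q * np.sqrt(2 * np.pi))

n = 10_000
x = rng.normal(mu_q, sigma_q, size=n)          # samples from the proposal
log_w = log_p_unnorm(x) - log_q(x)             # log importance weights
w = np.exp(log_w - log_w.max())                # stabilised unnormalised weights

# SNIS estimate of E_p[f(X)] with f(x) = x; the true value here is 0
snis = np.sum(w * x) / np.sum(w)
print(snis)
```

The division by the weight sum is what makes the estimator self-normalized, and it is also the source of the bias discussed above: the ratio of two unbiased estimators is not itself unbiased.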

In this talk, I'll discuss the issue of bias in SNIS and present a new method, BR-SNIS, whose complexity is essentially the same as that of SNIS and which significantly reduces the bias without significantly increasing the variance.

This method is a wrapper in the sense that it uses the same proposal samples and importance weights as SNIS, but makes clever use of iterated sampling–importance resampling (i-SIR) to form a bias-reduced version of the estimator. I'll also talk about the extension of this idea to Particle Smoothing.
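For intuition, a generic i-SIR transition can be sketched as follows. This is not the BR-SNIS estimator itself, only the iterated sampling–importance resampling kernel it builds on; the Gaussian target and proposal are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative unnormalised target (standard normal) and proposal N(1, 2^2)
def log_p_unnorm(x):
    return -0.5 * x ** 2

mu_q, sigma_q = 1.0, 2.0

def log_q(x):
    return -0.5 * ((x - mu_q) / sigma_q) ** 2 - np.log(sigma_q * np.sqrt(2 * np.pi))

def isir_step(x_curr, n_cand=64):
    """One i-SIR transition: keep the current state among the candidates,
    draw fresh proposals for the rest, then resample one candidate
    proportionally to the importance weights."""
    cand = np.concatenate(([x_curr], rng.normal(mu_q, sigma_q, n_cand - 1)))
    log_w = log_p_unnorm(cand) - log_q(cand)
    w = np.exp(log_w - log_w.max())
    return cand[rng.choice(n_cand, p=w / w.sum())]

# The resulting chain leaves the target invariant, so averaging along it
# estimates E_p[f(X)]; here f(x) = x and the true value is 0.
x, chain = 0.0, []
for _ in range(2000):
    x = isir_step(x)
    chain.append(x)
chain_mean = np.mean(chain[500:])
print(chain_mean)
```

Because each transition reuses proposal samples and weights of exactly the form SNIS already computes, iterating this kernel costs essentially nothing extra, which is what makes the wrapper construction above possible.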
This talk is based on the following paper: https://proceedings.neurips.cc/paper_files/paper/2022/hash/04bd683d5428d91c5fbb5a7d2c27064d-Abstract-Conference.html

Location
Amphi C2.0.37
Date
Workshop end date