Andrew Stuart, University of Warwick

What makes filtering hard?

Coauthors
S. Agapiou, O. Papaspiliopoulos and D. Sanz-Alonso

Abstract:

At the heart of many data assimilation algorithms is a filter of some sort. The particle filter is the gold standard in this context because of its provable approximation properties. In practice, however, the particle filter behaves poorly in many geophysical problems because of weight collapse. As a result, ensemble filters (the EnKF and its variants), in which all particles are equally weighted, are preferred in many applications, even though their statistical approximation properties are less well understood. Nonetheless, there is currently a great deal of research activity on the particle filter in the context of geophysical applications. The purpose of this talk is to survey and unify that body of work.
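
As a rough illustration of the weight collapse mentioned above, here is a minimal sketch of a bootstrap particle filter for a toy scalar model; the model, the noise levels, and the function name `bootstrap_filter` are illustrative assumptions, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(ys, n=1000, sig_model=1.0, sig_obs=0.5):
    """Bootstrap particle filter for the toy model
    x_{k+1} = 0.9 x_k + N(0, sig_model^2),  y_k = x_k + N(0, sig_obs^2)."""
    x = rng.normal(size=n)                            # initial ensemble
    for y in ys:
        x = 0.9 * x + sig_model * rng.normal(size=n)  # propagate through the model
        logw = -0.5 * ((y - x) / sig_obs) ** 2        # log observation likelihood
        w = np.exp(logw - logw.max())                 # stabilized unnormalized weights
        w /= w.sum()                                  # self-normalize
        print("effective sample size:", 1.0 / np.sum(w ** 2))
        x = x[rng.choice(n, size=n, p=w)]             # multinomial resampling
    return x

# Synthetic observations drawn from the same toy model.
xt, ys = 0.0, []
for _ in range(20):
    xt = 0.9 * xt + rng.normal()
    ys.append(xt + 0.5 * rng.normal())
bootstrap_filter(np.array(ys))
```

When the weights concentrate on a few particles, the effective sample size printed at each step drops far below n; that concentration is what "weight collapse" refers to.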

The particle filter has importance sampling at its core. The basic idea of importance sampling is to use independent samples from one probability distribution to approximate another probability distribution. Knowing how many samples are needed is key to the efficiency of the method, and hence to understanding when it will be effective and when it will not. The talk will focus on this issue, and in particular on how properties of the assimilation problem, such as model dimension, data dimension and size of noise, play into the efficiency of importance sampling, and hence of filtering.
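
To make the dimension dependence concrete, here is a small sketch of self-normalized importance sampling between two unit-covariance Gaussians; the specific target, proposal, shift, and the function name `importance_sample_ess` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def importance_sample_ess(d, n=10_000, shift=0.5):
    """Self-normalized importance sampling in dimension d with
    proposal q = N(0, I) and target p = N(shift * 1, I).
    Returns the effective sample size 1 / sum(w_i^2)."""
    x = rng.normal(size=(n, d))                    # independent samples from q
    # log p(x) - log q(x) for the two Gaussians above
    logw = (shift * x - 0.5 * shift ** 2).sum(axis=1)
    w = np.exp(logw - logw.max())                  # stabilized unnormalized weights
    w /= w.sum()                                   # self-normalize
    return 1.0 / np.sum(w ** 2)

for d in (1, 10, 100, 400):
    print(d, importance_sample_ess(d))
```

Even in this simple Gaussian setting, the effective sample size decays rapidly as the dimension d grows, which is one way the talk's question of "how many samples are needed" becomes a question about problem dimension and noise size.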

This is joint work with S. Agapiou, O. Papaspiliopoulos and D. Sanz-Alonso. It may be found as arXiv:1511.06196.
