Honorary Professor
University of Geneva, Research Center for Statistics and Geneva School of Economics and Management
Abstract: Over the past several decades, the theory and applications of robust statistics have developed considerably. This development has taken place mainly within the frequentist framework, with comparatively few results for the Bayesian approach. Since robust statistics deals with deviations from ideal models and develops statistical procedures that remain reliable and reasonably efficient in a neighborhood of the model, the stability of inference under small deviations from the assumptions should clearly concern both approaches. This is all the more important today, when the analysis and modelling of complex data are required in many fields, in particular for the development of AI technology. Fortunately, over the past decade, with the development of powerful algorithms, the robustness issue has gained importance within the Bayesian framework.
In this talk, we discuss some of these recent developments by focusing on two main aspects. First, we outline the transfer of some fundamental ideas and tools from the classical theory of robust statistics (including M-estimation, robust testing, and Huber's minimax theory) to the Bayesian setup. One implication of this transfer is the recommendation to replace exact likelihoods with Huber's least favorable distributions when sampling from posterior distributions. Second, we discuss the difficulty of obtaining exact finite-sample results in Bayesian robustness, and we outline a proposal that aims to combine asymptotic guarantees with exact finite-sample bounds. Finally, we briefly illustrate how the Bayesian filter can be robustified.
Keywords: Asymptotic guarantees, robust Bayesian filter, finite-sample bounds, least favorable distributions, minimax theory
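
As a rough illustration of the first point (a sketch of ours, not taken from the talk), the Python snippet below replaces the Gaussian log-likelihood of a location parameter with the Huber loss, which is, up to an additive constant, minus the log-density of Huber's least favorable distribution, and samples the resulting posterior with a plain random-walk Metropolis step. The tuning constant k = 1.345, the Gaussian prior, and all function names are assumptions made for illustration only.

import numpy as np

def huber_rho(r, k=1.345):
    # Huber's rho: quadratic in the core, linear in the tails.
    # exp(-rho) is, up to normalization, Huber's least favorable density.
    a = np.abs(r)
    return np.where(a <= k, 0.5 * r**2, k * a - 0.5 * k**2)

def log_posterior(theta, x, k=1.345, prior_mean=0.0, prior_sd=10.0):
    # Unnormalized log posterior for a location parameter theta:
    # robust (least favorable) likelihood plus a Gaussian prior.
    log_lik = -np.sum(huber_rho(x - theta, k))
    log_prior = -0.5 * ((theta - prior_mean) / prior_sd) ** 2
    return log_lik + log_prior

def metropolis(x, n_iter=5000, step=0.5, seed=0):
    # Random-walk Metropolis sampler for the robust posterior.
    rng = np.random.default_rng(seed)
    theta = np.median(x)            # robust starting value
    lp = log_posterior(theta, x)
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop, x)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

# Toy data with one gross outlier: the robust posterior for the location
# should be only mildly affected by it, unlike the exact Gaussian likelihood.
x = np.concatenate([np.random.default_rng(1).normal(0.0, 1.0, 50), [20.0]])
draws = metropolis(x)
print(draws[1000:].mean())          # posterior mean after burn-in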
