## Approximate Bayesian inference methods for stochastic state space models

Research output: Book/Report › Doctoral thesis › Collection of Articles

### Details

Original language | English
---|---
Publisher | Tampere University of Technology
Number of pages | 56
ISBN (Electronic) | 978-952-15-4106-3
ISBN (Print) | 978-952-15-4091-2
Publication status | Published - 23 Feb 2018
Publication type | G5 Doctoral dissertation (article)

### Publication series

Name | Tampere University of Technology. Publication
---|---
Volume | 1528
ISSN (Print) | 1459-2045

### Abstract

This thesis collects research results obtained during my doctoral studies on approximate Bayesian inference in stochastic state-space models. The published research spans a variety of topics, including 1) application of Gaussian filtering in satellite orbit prediction, 2) outlier-robust linear regression using the variational Bayes (VB) approximation, 3) filtering and smoothing in continuous-discrete Gaussian models using the VB approximation, and 4) parameter estimation using twisted particle filters. The main goal of the introductory part of the thesis is to connect these results to the general framework of estimating states and model parameters and to present them in a unified manner.

Bayesian inference for non-linear state-space models generally requires the use of approximations, since the exact posterior distribution is available in closed form only in a few special cases. The approximation methods can be roughly classified into two groups: deterministic methods, where the intractable posterior distribution is approximated by a member of a more tractable family of distributions (e.g. Gaussian and VB approximations), and stochastic sampling-based methods (e.g. particle filters). Gaussian approximation refers to directly approximating the posterior with a Gaussian distribution, and can be readily applied to models with Gaussian process and measurement noise. Well-known examples are the extended Kalman filter and the sigma-point-based unscented Kalman filter. The VB method is based on minimizing the Kullback-Leibler divergence between the true posterior and the approximate distribution, which is chosen from a family of simpler, more tractable distributions.
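As a concrete illustration of the deterministic, Gaussian-approximation family, the linear Kalman filter (for which the Gaussian posterior is exact) can be sketched as follows. The model matrices and noise levels below are arbitrary illustrative values, not taken from the thesis:

```python
import numpy as np

def kalman_step(m, P, y, A, Q, H, R):
    """One predict/update cycle of the linear Kalman filter.

    m, P : prior mean and covariance of the state
    y    : new measurement
    A, Q : state-transition matrix and process-noise covariance
    H, R : measurement matrix and measurement-noise covariance
    """
    # Predict: propagate the Gaussian through the linear dynamics.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update: condition the predicted Gaussian on the measurement.
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = P_pred - K @ S @ K.T
    return m_new, P_new

# Illustrative 1-D random-walk model observed in noise.
A = np.array([[1.0]]); Q = np.array([[0.1]])
H = np.array([[1.0]]); R = np.array([[0.5]])
m, P = np.array([0.0]), np.array([[1.0]])
for y in [0.9, 1.1, 1.0]:
    m, P = kalman_step(m, P, np.array([y]), A, Q, H, R)
```

The extended and unscented Kalman filters mentioned above replace the exact linear propagation in the predict step with a linearization or a sigma-point approximation, but keep the same Gaussian update structure.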

The first main contribution of the thesis is the development of a VB approximation for linear regression problems with outlier-robust measurement distributions. A broad family of outlier-robust distributions can be represented as infinite mixtures of Gaussians, called Gaussian scale mixture models; examples include the Student's t-distribution, the Laplace distribution, and the contaminated normal distribution. The VB approximation for the regression problem can be readily extended to the estimation of state-space models and is presented in the introductory part.

VB approximations can also be used for approximate inference in continuous-discrete Gaussian models, where the dynamics are modeled with stochastic differential equations and measurements are obtained at discrete time instants. The second main contribution is the presentation of a VB approximation for these models and an explanation of how the resulting algorithm connects to the Gaussian filtering and smoothing framework.
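For a linear SDE the continuous-discrete Gaussian filter is exact, which makes the structure easy to sketch. Below, an Ornstein-Uhlenbeck process (an illustrative choice, not a model from the thesis) has its mean and variance propagated in closed form between measurement times and updated at the discrete observations:

```python
import numpy as np

# Ornstein-Uhlenbeck dynamics: dx = -theta * x dt + sigma dW.
theta, sigma, R = 1.0, 0.5, 0.1  # drift, diffusion, measurement variance
dt = 0.5                         # spacing of the discrete measurements

def predict(m, P, dt):
    """Exact moment propagation of the OU process over a gap of length dt."""
    a = np.exp(-theta * dt)
    m_pred = a * m
    P_pred = a**2 * P + sigma**2 / (2 * theta) * (1 - a**2)
    return m_pred, P_pred

def update(m, P, y):
    """Standard scalar Kalman update at a measurement instant."""
    S = P + R  # innovation variance
    K = P / S  # Kalman gain
    return m + K * (y - m), P - K * P

m, P = 0.0, 1.0
for y in [0.4, 0.2, -0.1, 0.0]:
    m, P = predict(m, P, dt)
    m, P = update(m, P, y)
```

For non-linear SDEs the moment propagation is no longer available in closed form, which is where the Gaussian and VB approximations discussed in the thesis enter.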

The third contribution of the thesis is the development of parameter estimation using the particle Markov chain Monte Carlo (PMCMC) method and twisted particle filters. Twisted particle filters are obtained from standard particle filters by applying a special weighting to the sampling law of the filter. The weighting is chosen to minimize the variance of the marginal likelihood estimate, and the resulting PMCMC algorithm is more efficient than conventional variants. The exact optimal weighting is generally not available, but it can be approximated using the Gaussian filtering and smoothing framework.
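The role of the marginal-likelihood estimate in PMCMC can be illustrated with a plain bootstrap particle filter (without the twisting, which modifies the sampling law) on a linear-Gaussian model, where the Kalman filter provides the exact value for comparison. All model parameters here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
a, q, r, T, N = 0.9, 0.3, 0.5, 30, 5000  # AR(1) model and PF settings

# Simulate data from x_t = a x_{t-1} + N(0, q), y_t = x_t + N(0, r).
x, ys = 0.0, []
for _ in range(T):
    x = a * x + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))

def kf_loglik(ys):
    """Exact log marginal likelihood via the Kalman filter."""
    m, P, ll = 0.0, q, 0.0
    for y in ys:
        S = P + r
        ll += -0.5 * (np.log(2 * np.pi * S) + (y - m) ** 2 / S)
        K = P / S
        m, P = m + K * (y - m), P - K * P   # measurement update
        m, P = a * m, a**2 * P + q          # predict next step
    return ll

def pf_loglik(ys):
    """Bootstrap particle filter estimate of the same quantity."""
    particles = np.zeros(N)  # every particle starts at x_0 = 0
    ll = 0.0
    for y in ys:
        # Propagate through the dynamics (the bootstrap proposal).
        particles = a * particles + rng.normal(0, np.sqrt(q), size=N)
        # Weight by the measurement likelihood.
        logw = -0.5 * (np.log(2 * np.pi * r) + (y - particles) ** 2 / r)
        w = np.exp(logw - logw.max())
        ll += logw.max() + np.log(w.mean())
        # Multinomial resampling.
        particles = rng.choice(particles, size=N, p=w / w.sum())
    return ll

ll_kf, ll_pf = kf_loglik(ys), pf_loglik(ys)
print(ll_kf, ll_pf)
```

A twisted particle filter reweights the proposal and resampling steps so that the variance of this likelihood estimate shrinks, which in turn improves the mixing of the PMCMC chain that consumes the estimate.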
