# American Institute of Mathematical Sciences

doi: 10.3934/dcdss.2021045
Online First


## Resampled ensemble Kalman inversion for Bayesian parameter estimation with sequential data

1. School of Mathematical Sciences, Shanghai Jiao Tong University, 800 Dongchuan Rd, Shanghai 200240, China
2. School of Mathematics, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK

* Corresponding author: Jinglai Li

Received: January 2021. Revised: February 2021. Early access: April 2021.

Fund Project: The work was supported by NSFC under grant number 11771289

Many real-world problems require estimating parameters of interest in a Bayesian framework from data that are collected sequentially in time. Conventional methods for sampling the posterior distribution, such as Markov chain Monte Carlo (MCMC), cannot deal with such problems efficiently because they do not take advantage of the sequential structure. To this end, ensemble Kalman inversion (EnKI), which updates the particles whenever a new collection of data arrives, has become a popular tool for solving this type of problem. In this work we present a method to improve the performance of EnKI by removing, via a resampling procedure, particles that deviate significantly from the posterior distribution. Specifically, we adapt an idea developed for the sequential Monte Carlo sampler and simplify it to compute an approximate weight function; the computed weights are then used to identify and remove particles that seriously deviate from the target distribution. With numerical examples, we demonstrate that, without requiring any additional evaluations of the forward model, the proposed method can improve the performance of standard EnKI on a certain class of problems.
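The abstract's recipe — a particle update on each new data batch, followed by weight-based removal of outlying particles — can be sketched in Python. This is a minimal illustration, not the paper's algorithm: it assumes a standard perturbed-observation ensemble Kalman update, and it uses simple data-misfit weights as a stand-in for the paper's approximate weight function; the function names, the resampled fraction `frac`, and the linear test model are all invented for the example.

```python
import numpy as np

def enki_update(particles, forward, y, noise_cov, rng):
    """One perturbed-observation ensemble Kalman update for a new data batch y."""
    M = len(particles)
    preds = np.array([forward(u) for u in particles])   # forward-model outputs G(u_j)
    du = particles - particles.mean(axis=0)
    dg = preds - preds.mean(axis=0)
    C_ug = du.T @ dg / (M - 1)                          # parameter-data cross-covariance
    C_gg = dg.T @ dg / (M - 1) + noise_cov              # predicted-data covariance
    K = C_ug @ np.linalg.inv(C_gg)                      # Kalman gain
    obs = y + rng.multivariate_normal(np.zeros(len(y)), noise_cov, size=M)
    return particles + (obs - preds) @ K.T

def resample_low_weight(particles, weights, frac, rng):
    """Replace the lowest-weight fraction of particles with copies of
    higher-weight ones, drawn in proportion to their weights."""
    M = len(particles)
    k = int(frac * M)
    order = np.argsort(weights)
    bad, good = order[:k], order[k:]
    replacements = rng.choice(good, size=k, p=weights[good] / weights[good].sum())
    out = particles.copy()
    out[bad] = particles[replacements]
    return out

# Toy linear test: recover truth from y = A @ truth; reuses the same batch
# at every step purely to keep the illustration short.
rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0])
A = np.array([[1.0, 0.5], [0.0, 1.0]])
forward = lambda u: A @ u
noise_cov = 0.01 * np.eye(2)
y = forward(truth)
particles = rng.normal(0.0, 2.0, size=(200, 2))
for _ in range(5):
    particles = enki_update(particles, forward, y, noise_cov, rng)
    misfit = np.array([(forward(u) - y) @ np.linalg.solve(noise_cov, forward(u) - y)
                       for u in particles])
    w = np.exp(-0.5 * (misfit - misfit.min()))          # illustrative misfit weights
    particles = resample_low_weight(particles, w, 0.1, rng)
print(particles.mean(axis=0))  # ensemble mean should lie near truth
```

Note the key selling point from the abstract: the weights are computed from forward-model outputs that the Kalman update already produced, so the resampling step costs no extra model evaluations.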

Citation: Jiangqi Wu, Linjie Wen, Jinglai Li. Resampled ensemble Kalman inversion for Bayesian parameter estimation with sequential data. Discrete & Continuous Dynamical Systems - S, doi: 10.3934/dcdss.2021045
##### Figures:

- The simulated data for $\sigma = 0.4$ (left) and $\sigma = 0.6$ (right). The lines show the simulated states in continuous time and the dots are the noisy observations.
- The results for the case where the noise variance is $0.4^2$. Left: the error for the simulation with $100$ particles. Right: the error for the simulation with $500$ particles.
- The results for the case where the noise variance is $0.6^2$. Left: the error for the simulation with $100$ particles. Right: the error for the simulation with $500$ particles.
- $\Delta = 0.05$. Top: the average estimation error of the simulation with 2000 particles. Bottom: the average estimation error of the simulation with 5000 particles. In both rows, the left figure shows the results for the observed dimensions and the right one shows the unobserved ones.
- $\Delta = 0.1$. Top: the average estimation error of the simulation with 2000 particles. Bottom: the average estimation error of the simulation with 5000 particles. In both rows, the left figure shows the results for the observed dimensions and the right one shows the unobserved ones.
- Left: the ground truth for $x$. Right: both the noise-free and the noisy data at $t = 3$.
- The estimation error for the high-dimensional nonlinear example. Left: the results for $M = 2000$; right: the results for $M = 5000$.
