Early Access

Early Access articles are published articles within a journal that have not yet been assigned to a formal issue. This means they do not yet have a volume number, issue number, or page numbers; however, they can still be found and cited using their DOI (Digital Object Identifier). Early Access publication benefits the research community by making new scientific discoveries known as quickly as possible.

Readers can access Early Access articles via the “Early Access” tab for the selected journal.

An adaptive mixture view of particle filters

  • *Corresponding author: Nicola Branchini
  • Particle filters (PFs) are algorithms that approximate the so-called filtering distributions in complex state-space models. We present a unified view of PFs as importance sampling (IS) with adaptive mixture proposals. Existing PFs can be derived as special cases by making specific choices for the components of the mixture proposals and for the importance weights. Our perspective clarifies that existing PFs implicitly construct particular mixture proposals whose components are chosen independently of each other. We exploit the flexibility introduced by our perspective to propose a class of algorithms, adaptive mixture particle filters (AM-PF). Following IS arguments, the aim is to optimize the mixture proposal to match (an approximation of) the filtering posterior. We discuss two particular cases of the framework, the improved APF (IAPF) and the optimized APF (OAPF). In both linear and nonlinear dynamical systems, our mixture particle filters consistently show improved performance compared to widely used algorithms such as the bootstrap particle filter (BPF) and the auxiliary particle filter (APF). We conclude by outlining promising future directions opened by our framework.

    Mathematics Subject Classification: Primary: 58F15, 58F17; Secondary: 53C35.
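The IS view of particle filtering described in the abstract can be made concrete with a minimal bootstrap particle filter. The sketch below uses a 1-D Gaussian random-walk state-space model chosen purely for illustration (it is not the paper's experimental setup); in the mixture view, multinomial resampling selects a mixture component $ f({\bf x}_t | {\bf x}_{t-1}^{(m)}) $ and the importance weight reduces to the likelihood.

```python
import numpy as np

def bootstrap_pf(y, M, rng, sigma_x=1.0, sigma_y=1.0):
    """Bootstrap particle filter (BPF) for an illustrative 1-D SSM:

        x_t = x_{t-1} + N(0, sigma_x^2),   y_t = x_t + N(0, sigma_y^2).

    In the mixture-proposal view, the BPF proposal is the mixture
    sum_m (1/M) f(x_t | x_{t-1}^{(m)}): resampling picks a component,
    then we sample from the chosen transition kernel.
    """
    x = rng.normal(0.0, 1.0, size=M)      # initial particles
    w = np.full(M, 1.0 / M)               # normalized importance weights
    means = []
    for yt in y:
        # Multinomial resampling = drawing mixture components i.i.d.
        idx = rng.choice(M, size=M, p=w)
        # Propagate through the transition kernel (the mixture component).
        x = x[idx] + rng.normal(0.0, sigma_x, size=M)
        # Target/proposal ratio reduces to the likelihood for the BPF.
        logw = -0.5 * ((yt - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))       # filtering-mean estimate
    return np.array(means)

rng = np.random.default_rng(0)
true_x = np.cumsum(rng.normal(size=50))   # latent random walk
y = true_x + rng.normal(size=50)          # noisy observations
est = bootstrap_pf(y, M=500, rng=rng)
```

The algorithms the paper studies differ precisely in how the mixture components and weights above are chosen; the BPF is the special case where both are fixed in advance.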


  • Figure 1.  Toy example. In this experiment we show that the proposals of AM-PF-based algorithms (IAPF and OAPF) are closer to the true posteriors than those of the competitors. The first row shows the transition kernels $ f({\bf x}_t | {\bf x}_{t-1}^{(m)}) $ and the likelihood associated with the true filtering distribution. The $ \chi^2 $-divergences for these examples are reported in Table 1. Note that here OAPF uses the transition kernels as proposal components and their means $ \boldsymbol{\mu}_{t | t-1}^{(m)} $ as evaluation points, with $ K = M $.

    Figure 2.  (Linear Gaussian SSM.) MSE of the posterior-mean estimate, normalized by the true value, in the linear Gaussian state-space model, over 100 Monte Carlo replications. In this experiment $ K $ was set to $ 5 $ for all numbers of particles with no loss of performance. We found similar qualitative results for other values of the SSM parameters.

    Figure 3.  (Lorenz 63 system.) Effective sample size (ESS) over timesteps for $ 1000 $ Monte Carlo replications. The legend shows the value of $ K $, which was set relative to $ M = 1000 $, the number of particles. Unlike in the linear Gaussian system, reducing the number of kernels $ K $ too much degrades performance for these settings of the system parameters. When $ K = M $, OAPF outperforms the other algorithms, which suggests that the proposal constructed by optimizing the mixture weights indeed better matches the posterior distribution.

    Figure 4.  For the Lorenz 63 experiment we show (i) runtime (in seconds) per iteration in $ t $, (ii) ESS, and (iii) ESS/runtime, versus the number of particles $ M \in \{ 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000 \} $. The AM-PF algorithms (IAPF, OAPF) clearly suffer from the $ M^2 $ computational cost as discussed in Section 3. Recall that ESS is bounded by $ M $. Also, notice how the ESS for OAPF grows almost linearly as a function of $ M $.
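The ESS reported in Figures 3 and 4 is the standard IS diagnostic computed from the normalized importance weights; it is bounded above by the number of particles $ M $. A minimal sketch:

```python
import numpy as np

def effective_sample_size(w):
    """ESS = 1 / sum_m(w_m^2) for a vector of importance weights.

    After normalization, ESS equals M for uniform weights (a proposal
    matching the target) and approaches 1 when a single particle
    carries almost all of the mass (weight degeneracy).
    """
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                  # normalize the weights
    return 1.0 / np.sum(w ** 2)

uniform = effective_sample_size(np.full(1000, 1.0))        # close to 1000
degenerate = effective_sample_size([1.0] + [1e-12] * 999)  # close to 1
```

A higher ESS for the same $ M $ indicates a proposal that better matches the posterior, which is the criterion the figures use to compare the algorithms.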

    Table 1.  Values of the Pearson $ \chi^2 $ divergence between the mixture proposal and the target; lower is better

    Algorithm   $ \chi^2 $ (Fig. 1a)   $ \chi^2 $ (Fig. 1b)   $ \chi^2 $ (Fig. 1c)   $ \chi^2 $ (Fig. 1d)
    BPF         1.89                   0.22                   2.31                   1.72
    APF         0.30                   0.16                   0.27                   0.36
    IAPF        0.12                   0.24                   0.31                   0.28
    OAPF        0.0043                 0.09                   0.08                   0.08
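The Pearson $ \chi^2 $ divergences in Table 1 can be estimated by plain Monte Carlo whenever the proposal and target densities can be evaluated pointwise. The sketch below is a generic estimator applied to hypothetical 1-D Gaussian densities chosen for illustration (not the paper's actual examples):

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2), written out to keep the sketch self-contained."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def chi2_divergence(target_pdf, proposal_pdf, sampler, n=200_000, rng=None):
    """Monte Carlo estimate of the Pearson chi^2 divergence

        chi^2(pi || q) = E_q[(pi(X) / q(X))^2] - 1,   X ~ q.

    The proposal's tails must dominate the target's, otherwise the
    estimator has infinite variance -- the same condition under which
    importance sampling itself behaves well.
    """
    rng = rng or np.random.default_rng(0)
    x = sampler(rng, n)
    ratio = target_pdf(x) / proposal_pdf(x)   # importance-weight function
    return float(np.mean(ratio ** 2) - 1.0)

# Hypothetical example: N(0, 1) target, wider N(0.5, 1.5^2) proposal.
d = chi2_divergence(
    lambda x: normal_pdf(x, 0.0, 1.0),
    lambda x: normal_pdf(x, 0.5, 1.5),
    lambda rng, n: 0.5 + 1.5 * rng.standard_normal(n),
)
```

The divergence is zero exactly when the proposal matches the target, which is why lower values in Table 1 indicate a better-adapted mixture proposal.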



