doi: 10.3934/fods.2020015

Online learning of both state and dynamics using ensemble Kalman filters

CEREA, joint laboratory École des Ponts ParisTech and EDF R&D, Université Paris-Est, Champs-sur-Marne, France

* Corresponding author: Marc Bocquet

Received: June 2020. Revised: August 2020. Published: September 2020.

The reconstruction of the dynamics of an observed physical system as a surrogate model has been brought to the fore by recent advances in machine learning. To deal with partial and noisy observations in that endeavor, machine learning representations of the surrogate model can be used within a Bayesian data assimilation framework. However, these approaches require long time series of observational data, meant to be assimilated all together. This paper investigates the possibility of learning both the dynamics and the state online, i.e. of updating their estimates at any time, in particular when new observations are acquired. The estimation is based on the ensemble Kalman filter (EnKF) family of algorithms, using a rather simple representation for the surrogate model and state augmentation. We consider the implications of learning dynamics online through (i) a global EnKF, (ii) a local EnKF and (iii) an iterative EnKF, and in each case we discuss the issues and algorithmic solutions. We then numerically demonstrate the efficiency and assess the accuracy of these methods using one-dimensional, one-scale and two-scale chaotic Lorenz models.
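The estimation device named in the abstract, state augmentation, appends the surrogate model's parameters to the state vector so that a single EnKF analysis updates both at once. A minimal sketch on a scalar toy model, using a stochastic (perturbed-observation) EnKF; the dynamics, noise levels and ensemble size below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar toy problem (all numbers are illustrative assumptions): the truth
# evolves as x_{k+1} = a x_k + model noise, with a = 0.9 unknown to the filter.
a_true, n_ens, n_steps = 0.9, 50, 200
sig_obs, sig_mod = 0.1, 0.05

def forecast(z):
    # propagate the augmented ensemble z = (x, a); parameters are persistent
    x, a = z
    return np.array([a * x + sig_mod * rng.standard_normal(x.shape), a])

x_true = 1.0
# augmented ensemble: row 0 holds state members, row 1 parameter members
ens = np.vstack([rng.standard_normal(n_ens),
                 0.5 + 0.2 * rng.standard_normal(n_ens)])

for _ in range(n_steps):
    x_true = a_true * x_true + sig_mod * rng.standard_normal()
    y = x_true + sig_obs * rng.standard_normal()       # observe x only
    ens = forecast(ens)
    anom = ens - ens.mean(axis=1, keepdims=True)
    cov_zy = anom @ anom[0] / (n_ens - 1)              # cov of (x, a) with observed x
    gain = cov_zy / (cov_zy[0] + sig_obs**2)           # Kalman gain for (x, a)
    y_pert = y + sig_obs * rng.standard_normal(n_ens)  # perturbed observations
    ens += np.outer(gain, y_pert - ens[0])             # one update corrects x and a

a_est = ens[1].mean()
```

The parameter row has no dynamics of its own; it is corrected purely through its ensemble cross-covariance with the observed state, which is the mechanism that online learning of the surrogate dynamics relies on.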

Citation: Marc Bocquet, Alban Farchi, Quentin Malartic. Online learning of both state and dynamics using ensemble Kalman filters. Foundations of Data Science, doi: 10.3934/fods.2020015


Figure 1.  Average state RMSE for the EnKF-ML with a hybrid adaptive inflation scheme, applied to the L96 model, for a range of ensemble sizes $ {N_\mathrm{e}} $ and several values of the standard deviation of the initial parameter vector mean. The error bars correspond to the standard deviation of $ {N_\mathrm{exp}} = 100 $ repeated experiments
Figure 2.  Initial evolution of the $ {N_\mathrm{p}} = 18 $ parameters of the surrogate model learned from observations of an L96 model run. The key parameters are the forcing ($ F = 8 $ in the true model), the friction ($ -1 $ in the true model) and the advection coefficients ($ -1, 1 $ in the true model)
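The true model behind these experiments is the one-scale Lorenz-96 (L96) model, whose tendency combines advection (coefficients $-1$ and $+1$), linear friction ($-1$) and constant forcing ($F = 8$). A minimal integration sketch; the RK4 step size dt = 0.05 is a conventional choice in the L96 literature, assumed here:

```python
import numpy as np

def l96_tendency(x, F=8.0):
    """Lorenz-96 tendency with cyclic boundaries: advection terms
    (coefficients -1 and +1), friction (-1) and forcing (F)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05, F=8.0):
    # standard fourth-order Runge-Kutta step
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)   # N_x = 40 variables, as in the localization figures
x[19] += 0.01           # small perturbation to trigger chaotic behaviour
for _ in range(1000):   # spin up onto the attractor (50 model time units)
    x = rk4_step(x)
```

In the paper's setting these tendency coefficients are exactly what the surrogate model has to recover from noisy observations.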
Figure 3.  Left panel (a): For both models, comparison of the performance of the EnKF-ML where the model is unknown and of the traditional EnKF where the model is known. The error bars are obtained from the standard deviation of $ {N_\mathrm{exp}} = 50 $ repeated experiments. Right panel (b): Comparison of Lyapunov spectra of the L96 model and of the surrogate model about an L96 trajectory
Figure 4.  For both models, comparison of the performance of the LEnKF-ML where the model is unknown and of the traditional LEnKF where the model is known. The error bars are obtained from the standard deviation of $ {N_\mathrm{exp}} = 10 $ repeated experiments
Figure 5.  Left panel (a): For both models, comparison of the performance of the LEnKF-ML where the model is unknown and of the traditional LEnKF where the model is known, as a function of the observation density. Right panel (b): For both models, comparison of the performance of the LEnKF-ML where the model is unknown and of the traditional LEnKF where the model is known, as a function of the observation error standard deviation. In all cases, the ensemble size is $ {N_\mathrm{e}} = 24 $ and the RMSE statistics are accumulated over $ {N_\mathrm{exp}} = 10 $ experiments
Figure 6.  Left panel (a): Optimal tapering coefficient $ \zeta $ across a range of ensemble sizes $ {N_\mathrm{e}} $ for LEnKF-ML applied to the L96 model ($ {N_\mathrm{x}} = 40 $). Right panel (b): Optimal tapering coefficient $ \zeta $ across a range of model state space dimensions $ {N_\mathrm{x}} $ for LEnKF-ML applied to the L96 model, assuming $ {N_\mathrm{e}} = 40 $, a fixed multiplicative inflation and localization length
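The localization behind the LEnKF-ML experiments, with its tapering coefficient $ \zeta $, is commonly built from the Gaspari-Cohn compactly supported fifth-order correlation function. A sketch of that taper; the half-width of c = 4 grid points below is an arbitrary illustrative choice:

```python
import numpy as np

def gaspari_cohn(d, c):
    """Gaspari-Cohn fifth-order piecewise rational taper.
    d: distance(s), c: half-width; correlations vanish beyond 2c."""
    r = np.atleast_1d(np.abs(d) / c).astype(float)
    taper = np.zeros_like(r)
    near, far = r <= 1.0, (r > 1.0) & (r <= 2.0)
    rn, rf = r[near], r[far]
    taper[near] = (-0.25 * rn**5 + 0.5 * rn**4 + 0.625 * rn**3
                   - 5.0 / 3.0 * rn**2 + 1.0)
    taper[far] = (rf**5 / 12.0 - 0.5 * rf**4 + 0.625 * rf**3
                  + 5.0 / 3.0 * rf**2 - 5.0 * rf + 4.0 - 2.0 / (3.0 * rf))
    return taper

d = np.arange(11)             # distances in grid points
rho = gaspari_cohn(d, c=4.0)  # taper weights; zero beyond 2c = 8 points
```

Multiplying sample covariances by such weights (Schur product) suppresses spurious long-range correlations, which is the standard motivation for localization in local EnKFs.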
Figure 7.  Comparison for both models of the performance of the IEnKF-ML where the model is unknown and of the traditional IEnKF where the model is known, as a function of the time interval between updates $ \Delta t $. The absence of a data point means that at least one of the $ {N_\mathrm{exp}} = 10 $ DA runs diverged
