
Learning dynamical systems using local stability priors

Abstract. A computational approach to simultaneously learn the vector field of a dynamical system with a locally asymptotically stable equilibrium and its region of attraction from the system's trajectories is proposed. The nonlinear identification leverages the local stability information as a prior on the system, effectively endowing the estimate with this important structural property. In addition, the knowledge of the region of attraction can be used to design experiments by informing the selection of initial conditions from which trajectories are generated and by enabling the use of a Lyapunov function of the system as a regularization term. Simulation results show that the proposed method allows efficient sampling and provides an accurate estimate of the dynamics in an inner approximation of its region of attraction.
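The core idea of the abstract can be illustrated with a minimal sketch: fit a model of the vector field to sampled (state, derivative) pairs while penalizing violations of the Lyapunov decrease condition $\nabla V(x)^\top f(x) < 0$. Everything here (the linear model class, the identity matrix $P$, the weight `lam`, the synthetic data) is an illustrative stand-in for the paper's setup, not its actual method.

```python
import numpy as np

# Toy illustration of the stability prior: fit a linear model
# f(x) = A x to sampled (state, derivative) pairs, adding a squared-hinge
# penalty on violations of the Lyapunov decrease condition
# grad V(x) . f(x) < 0 for V(x) = x^T P x.  All names (P, lam, the
# data) are illustrative choices, not the paper's.

rng = np.random.default_rng(0)
A_true = np.array([[-1.0, 2.0], [-2.0, -1.0]])   # stable ground truth
X = rng.normal(size=(200, 2))                     # sampled states
Y = X @ A_true.T                                  # exact derivatives

P = np.eye(2)          # Lyapunov candidate V(x) = x^T P x
lam = 0.1              # regularization weight
A = np.zeros((2, 2))   # model parameters

for _ in range(2000):  # plain gradient descent on the penalized loss
    R = X @ A.T - Y                        # residuals of the fit
    Vdot = np.sum((X @ P) * (X @ A.T), 1)  # grad V(x) . f(x) / 2 at samples
    viol = np.maximum(Vdot, 0.0)           # squared hinge: penalize increase only
    grad = R.T @ X / len(X)                # data-fit gradient
    grad += lam * (viol[:, None] * (X @ P)).T @ X / len(X)
    A -= 0.1 * grad
```

Because the ground-truth system already satisfies the decrease condition for this $V$, the penalty vanishes at the optimum and the fit recovers `A_true`; on noisy or scarce data the penalty is what biases the estimate toward stable models.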

    Mathematics Subject Classification: Primary: 93D05, 93B30; Secondary: 62J02.


  • Figure 1.  (a) One step of the iterative RoA estimation algorithm. The boundary of the RoA is learned as the decision boundary of a classifier. The blue area is the current estimate of the RoA. The green area contains initial states that converge to the estimated RoA and eventually the equilibrium. The red area contains the initial states that diverge. Each step of the algorithm learns the boundary between the stable and unstable regions. (b) For the vector field $ f $, $ f|_{\mathcal{R}_{\bar{x}}} $ denotes its restriction to the RoA, which is the objective of the ODE learning phase
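The labeling step behind Figure 1 can be sketched on a toy scalar system: simulate trajectories from a grid of initial states, mark each as converging or diverging, and read off the boundary between the two sets. The system $\dot{x} = -x + x^3$ (true RoA $(-1, 1)$), the thresholds, and the one-parameter "classifier" are all illustrative simplifications of the paper's classifier-based procedure.

```python
import numpy as np

# Label initial states of the toy system x' = -x + x^3 as converging
# (inside the RoA, which is truly (-1, 1)) or diverging, then take the
# largest radius whose grid points all converge as the learned boundary.

def simulate(x0, dt=1e-2, steps=2000):
    """Return True if the trajectory from x0 converges to the origin."""
    x = x0
    for _ in range(steps):
        x = x + dt * (-x + x**3)    # forward-Euler step
        if abs(x) > 10.0:           # treat blow-up as divergence
            return False
    return abs(x) < 1e-2            # settled at the equilibrium

x0s = np.linspace(-1.5, 1.5, 101)                 # grid of initial states
labels = np.array([simulate(x0) for x0 in x0s])   # converge / diverge

# one-parameter stand-in for the classifier's decision boundary
boundary = np.max(np.abs(x0s[labels]))
```

In the paper the boundary is a full decision surface in state space; the scalar radius here just makes the converge/diverge labeling concrete.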

    Figure 2.  Pre-training the randomly initialized neural network with a quadratic function. The background color shows the values of the Lyapunov function evaluated over a fine grid of points. Lighter colors correspond to larger values. The contours show the level sets
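The pre-training step of Figure 2 amounts to fitting a randomly initialized function approximator to a target quadratic so that training starts from a sensible Lyapunov shape. As a hedged sketch, a frozen random-feature model stands in for the paper's neural network, and the target $V(x) = x^\top x$, the feature count, and the sampling box are illustrative.

```python
import numpy as np

# Sketch of Lyapunov pre-training: fit V(x; theta) = phi(x) @ theta,
# with frozen random ReLU features phi, to the quadratic x^T x over a
# sampled box.  The random-feature model is a stand-in for the paper's
# randomly initialized neural network.

rng = np.random.default_rng(1)
W = rng.normal(size=(2, 64))           # frozen random first layer
b = rng.normal(size=64)

def phi(X):
    return np.maximum(X @ W + b, 0.0)  # ReLU features

X = rng.uniform(-2, 2, size=(500, 2))  # training points in the box
target = np.sum(X**2, axis=1)          # quadratic target V(x) = x^T x

theta, *_ = np.linalg.lstsq(phi(X), target, rcond=None)
V = lambda X: phi(X) @ theta           # pre-trained Lyapunov candidate

err = np.mean((V(X) - target) ** 2)    # training mean-squared error
```

After this step the level sets of `V` are close to circles, matching the contours shown in the figure, and subsequent training only needs to deform them toward the true RoA geometry.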

    Figure 3.  (Van der Pol Oscillator) The growth stages of the RoA estimation algorithm of Section 2.3. Color codes are as follows. Green: True RoA. Blue: Estimated RoA (the largest sublevel set of the learned Lyapunov function that satisfies the decrease condition (3)). Pink: The gap $ \mathcal{G} = \mathcal{S}_{\alpha c}(V(\cdot;\theta))\backslash \mathcal{S}_c(V(\cdot;\theta)) $ from which the initial states of the trajectories are picked
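The growth step of Figure 3 draws new initial states from the gap $\mathcal{G} = \mathcal{S}_{\alpha c}(V)\backslash \mathcal{S}_c(V)$ between two sublevel sets of the learned Lyapunov function. A minimal rejection-sampling sketch, with an illustrative quadratic $V$, level $c$, and growth factor $\alpha$ standing in for the learned quantities:

```python
import numpy as np

# Rejection-sample initial states from the gap between the sublevel
# sets S_c(V) and S_{alpha*c}(V).  V, c and alpha are illustrative
# stand-ins for the learned Lyapunov function and certified level.

rng = np.random.default_rng(2)
V = lambda x: np.sum(x**2, axis=-1)   # toy Lyapunov function
c, alpha = 1.0, 1.5                   # current level and growth factor

def sample_gap(n):
    """Draw n points x with c < V(x) <= alpha * c."""
    out = []
    while len(out) < n:
        x = rng.uniform(-2, 2, size=(4 * n, 2))  # proposals in a box
        v = V(x)
        out.extend(x[(v > c) & (v <= alpha * c)])  # keep gap points only
    return np.array(out[:n])

pts = sample_gap(100)
```

Trajectories started from these gap points either confirm that the enlarged sublevel set still lies in the RoA or expose where the decrease condition fails.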

    Figure 4.  Sampled trajectories from inside the estimated RoA of each growth stage
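The Van der Pol trajectories of Figures 3-7 are commonly generated from the time-reversed oscillator, whose origin is locally asymptotically stable with the (now unstable) limit cycle as the RoA boundary. A minimal simulation sketch; the parameter $\mu$, the step size, and the initial state are illustrative choices, and forward Euler stands in for whatever integrator the paper uses.

```python
import numpy as np

# Time-reversed Van der Pol oscillator: the origin is locally
# asymptotically stable and the limit cycle bounds its RoA.
# mu, the step size and the initial state are illustrative.

mu = 1.0

def f(x):
    """Time-reversed Van der Pol vector field."""
    return np.array([-x[1], x[0] - mu * (1 - x[0] ** 2) * x[1]])

x = np.array([0.5, 0.5])   # initial state inside the RoA
for _ in range(20000):     # forward-Euler integration, T = 20
    x = x + 1e-3 * f(x)
```

Initial states inside the limit cycle, like this one, spiral into the origin; initial states outside it diverge, which is exactly the converge/diverge split the RoA estimator learns.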

    Figure 5.  (Van der Pol Oscillator) Left: True RoA. Right: The progressively learned ODE over the steps of the coupled algorithm. The darker background shows a larger mismatch between the learned and true vector fields

    Figure 6.  (Van der Pol Oscillator) The progressively learned ODE using neural networks

    Figure 7.  (Van der Pol Oscillator) Learned ODEs with and without the Lyapunov regularization term in the loss function

    Figure 8.  (Inverted Pendulum) Sampled trajectories from inside the estimated RoA of each growth stage
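For the inverted-pendulum experiments of Figures 8-9, a pendulum stabilized around the upright position gives the locally asymptotically stable equilibrium. A hedged stand-in: the linear state feedback, the gains, the damping, and the initial state below are illustrative and not the paper's controller, with the angle measured from upright.

```python
import numpy as np

# Toy inverted pendulum under illustrative linear state feedback.
# theta is measured from the upright position; b, k1, k2 are made-up
# gains, not the benchmark's actual parameters.

b, k1, k2 = 0.1, 2.0, 1.0

def f(x):
    theta, omega = x
    u = -k1 * theta - k2 * omega                   # stabilizing feedback
    return np.array([omega, np.sin(theta) - b * omega + u])

x = np.array([0.8, 0.0])   # initial state near the upright equilibrium
for _ in range(30000):     # forward-Euler integration, T = 30
    x = x + 1e-3 * f(x)
```

Under such a controller the upright equilibrium has a genuinely local RoA, which makes the benchmark a natural test for the coupled RoA-and-dynamics learning scheme.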

    Figure 9.  (Inverted Pendulum) The progressively learned ODE using neural networks



