doi: 10.3934/dcdss.2021098
Online First


Augmented Gaussian random field: Theory and computation

Sheng Zhang, Xiu Yang, Samy Tindel and Guang Lin

1. Department of Mathematics, Purdue University, West Lafayette, IN 47907, USA
2. Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA 18015, USA
3. School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907, USA

Received: March 2021. Revised: June 2021. Early access: August 2021.

We propose the augmented Gaussian random field (AGRF), a universal framework that incorporates observations of a quantity of interest together with its derivatives of any order. We establish a rigorous theory: under certain conditions, the observable and its derivatives of any order are governed by a single Gaussian random field, which is the AGRF. As a corollary, this validates the statement that the derivative of a Gaussian process remains a Gaussian process, since the derivative is represented by a part of the AGRF. We then construct a computational method corresponding to this universal framework, covering both noiseless and noisy scenarios, and derive the posterior distributions in closed form. A significant advantage of the method is that the AGRF framework provides a natural way to incorporate derivatives of arbitrary order and to handle missing data. We demonstrate its effectiveness on four numerical examples: a composite function, a damped harmonic oscillator, the Korteweg-De Vries equation, and Burgers' equation.
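To make the kind of computation described above concrete, here is a minimal sketch (our own illustration, not the authors' code or their exact formulation) of fusing observable and first-order-derivative data in a single one-dimensional Gaussian field. It uses the standard fact that derivatives of a squared-exponential kernel give the cross-covariances between a process and its derivative; the kernel choice, hyperparameters, and function names are all assumptions made for illustration.

```python
import numpy as np

# For the squared-exponential kernel k(x, z) = s2 * exp(-(x - z)^2 / (2 l^2)):
#   cov(f(x), f'(z))  = dk/dz
#   cov(f'(x), f'(z)) = d^2 k / (dx dz)
# so observable and derivative data can share one joint Gaussian field.

def k00(x, z, s2=1.0, l=1.0):
    """cov(f(x), f(z))."""
    return s2 * np.exp(-(x - z) ** 2 / (2 * l ** 2))

def k01(x, z, s2=1.0, l=1.0):
    """cov(f(x), f'(z)) = dk/dz."""
    return k00(x, z, s2, l) * (x - z) / l ** 2

def k11(x, z, s2=1.0, l=1.0):
    """cov(f'(x), f'(z)) = d^2 k / (dx dz)."""
    return k00(x, z, s2, l) * (1.0 - (x - z) ** 2 / l ** 2) / l ** 2

def augmented_posterior_mean(x0, y0, x1, y1, xs, s2=1.0, l=1.0, jitter=1e-8):
    """Posterior mean of f at points xs, given observable data (x0, y0)
    and first-derivative data (x1, y1) modeled in the same Gaussian field."""
    X0, X1, Xs = x0[:, None], x1[:, None], xs[:, None]
    A = k00(X0, X0.T, s2, l)          # observable-observable block
    B = k01(X0, X1.T, s2, l)          # observable-derivative cross block
    C = k11(X1, X1.T, s2, l)          # derivative-derivative block
    K = np.block([[A, B], [B.T, C]])
    K += jitter * np.eye(K.shape[0])  # numerical stabilization
    Ks = np.hstack([k00(Xs, X0.T, s2, l), k01(Xs, X1.T, s2, l)])
    y = np.concatenate([y0, y1])
    return Ks @ np.linalg.solve(K, y)
```

For example, with `y0 = sin(x0)` and `y1 = cos(x1)` as data, the posterior mean recovers the sine function accurately between the data points, and dropping either data block degrades the fit, which is the qualitative behavior the paper's noiseless examples illustrate.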

Citation: Sheng Zhang, Xiu Yang, Samy Tindel, Guang Lin. Augmented Gaussian random field: Theory and computation. Discrete & Continuous Dynamical Systems - S, doi: 10.3934/dcdss.2021098

Figure 1.  Graphical illustration of augmented Gaussian random field prediction with measurement noise. There are three layers: the input layer, the hidden layer, and the output layer. The hidden layer is governed by the augmented Gaussian random field. The observable and its derivatives of different orders are integrated into the same field to make predictions
Figure 2.  [Composite function (noiseless)] Prediction of the observable, first order derivative, and second order derivative by AGRF. Case 1: the data include the observable only. Case 2: the data include the observable and first order derivative. Case 3: the data include the observable and second order derivative. Case 4: the data include the observable, first order derivative, and second order derivative. AGRF is able to integrate the observable and derivatives of any order, regardless of the location where they are collected. The AGRF prediction improves when more information is available
Figure 3.  [Composite function (noiseless)] Comparison of the prediction accuracy of AGRF in different cases. See Figure 2 for more explanations
Figure 4.  [Damped harmonic oscillator (noiseless)] Prediction of the displacement, velocity, and phase-space diagram by different methods. GP: the data include the observable and first order derivative; the observable data are used to predict the displacement and the first order derivative data are used to predict the velocity, respectively. GEK: the data include the observable and first order derivative; all the data are used jointly in the same random field to predict the displacement and velocity at the same time. AGRF: the data include the observable, first order derivative, and second order derivative; all the data are used together in the same random field to predict the displacement and velocity at the same time. GEK produces better prediction than GP, while AGRF predicts more accurately than GEK. By using all the available information together in the same random field, we can construct the most accurate surrogate model
Figure 5.  [Damped harmonic oscillator (noiseless)] Comparison of the prediction accuracy by different methods. See Figure 4 for more explanations
Figure 6.  [Korteweg-De Vries equation (noisy)] Top: the solution at $ t = 0.5 $ is studied. Bottom: prediction of the observable, first order derivative, and second order derivative by AGRF under different levels of noise. AGRF has good performance even when the noise is as high as 40%. As one might expect, the AGRF prediction is better when the noise is lower
Figure 7.  [Korteweg-De Vries equation (noisy)] Comparison of the prediction accuracy under different levels of noise. See Figure 6 for more explanations
Figure 8.  [Burgers' equation (noisy)] Top: the solution at $ t = 0.5 $ is studied. Bottom: prediction of the observable, first order derivative, and second order derivative by different AGRF calibrations. No $ \delta $: noiseless formulation is used despite the presence of noise in the data, i.e., $ \delta_0 = \delta_1 = \delta_2 = 0 $ in Eqn. (87). One $ \delta $: the same noise intensity is used for different order derivatives, i.e., $ \delta_0 = \delta_1 = \delta_2 $ in Eqn. (87). Multiple $ \delta $: different noise intensities are used for different order derivatives, i.e., the same as Eqn. (87). When the noiseless formulation is used despite the presence of noise in the data, overfitting is an issue. When the same noise intensity is used for different order derivatives, the uncertainty in the prediction is incompatible with the data since different order derivatives have different scales. When the formulation is exactly the same as Eqn. (87), AGRF has the best performance
Figure 9.  [Burgers' equation (noisy)] Comparison of the prediction accuracy by different AGRF calibrations. See Figure 8 for more explanations. The relative $ L_2 $ errors in the case "no $ \delta $" are greater than $ 1.6 $ and out of bound
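The "multiple $ \delta $" calibration described in the Burgers' equation captions, where each derivative order carries its own noise intensity, can be sketched schematically: each order contributes its own variance to the corresponding diagonal block of the joint Gram matrix. The helper below is a hypothetical illustration under that assumption; the names and block layout are ours, not the paper's Eqn. (87).

```python
import numpy as np

def add_order_noise(K, sizes, deltas):
    """Add per-order noise variances delta_m**2 to the diagonal of the joint
    Gram matrix K, whose consecutive blocks (observable, first derivative,
    second derivative, ...) have the given sizes. With a single shared delta
    this reduces to the "one delta" calibration; with all deltas zero it
    reduces to the noiseless formulation."""
    noise = np.concatenate(
        [d ** 2 * np.ones(n) for n, d in zip(sizes, deltas)]
    )
    return K + np.diag(noise)
```

Because different order derivatives live on different scales, separate $ \delta_m $ values let the model attribute an appropriate uncertainty to each data type, which is why this calibration avoids both the overfitting of the noiseless formulation and the scale mismatch of a single shared noise level.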
