# Augmented Gaussian random field: Theory and computation

• We propose the novel augmented Gaussian random field (AGRF), a universal framework that incorporates data of an observable and its derivatives of any order. Rigorous theory is established: we prove that, under certain conditions, the observable and its derivatives of any order are governed by a single Gaussian random field, the aforementioned AGRF. As a corollary, the statement "the derivative of a Gaussian process remains a Gaussian process" is validated, since the derivative is represented by a part of the AGRF. Moreover, we construct a computational method corresponding to the universal AGRF framework. Both noiseless and noisy scenarios are considered, and closed-form expressions for the posterior distributions are derived. A significant advantage of our computational method is that the universal AGRF framework provides a natural way to incorporate derivatives of arbitrary order and to handle missing data. We demonstrate the effectiveness of the computational method with four numerical examples: a composite function, a damped harmonic oscillator, the Korteweg-De Vries equation, and Burgers' equation.
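The joint-field idea described above can be sketched with a standard Gaussian-process regression in which the observable and its first-order derivative share one covariance matrix. This is a minimal illustration under assumed choices (squared-exponential kernel, unit length scale, a sine test function), not the paper's implementation or its Eqn. (87):

```python
import numpy as np

def sqexp(x1, x2, l=1.0):
    # Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 l^2))
    r = x1[:, None] - x2[None, :]
    return np.exp(-r**2 / (2 * l**2)), r

def joint_cov(x1, x2, l=1.0):
    """Covariance of the stacked vector [f(x1), f'(x1)] vs [f(x2), f'(x2)].

    The derivative blocks follow from differentiating the kernel:
    cov(f, f') = dk/dx', cov(f', f) = dk/dx, cov(f', f') = d^2k/(dx dx').
    """
    K, r = sqexp(x1, x2, l)
    K_ff = K                              # cov(f, f)
    K_fd = (r / l**2) * K                 # cov(f, f')
    K_df = (-r / l**2) * K                # cov(f', f)
    K_dd = (1 / l**2 - r**2 / l**4) * K   # cov(f', f')
    return np.block([[K_ff, K_fd], [K_df, K_dd]])

# Training data: values and first derivatives of f(x) = sin(x)
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.concatenate([np.sin(X), np.cos(X)])

Xs = np.array([1.5])                       # test location
K = joint_cov(X, X) + 1e-10 * np.eye(2 * len(X))  # jitter for stability
Ks = joint_cov(Xs, X)                      # cross-covariance with training block
mean = Ks @ np.linalg.solve(K, y)
# mean[0] approximates sin(1.5); mean[1] approximates cos(1.5)
```

Because both data types live in the same random field, a single linear solve yields predictions for the observable and its derivative simultaneously, which is the mechanism the figures below compare against per-quantity GP regression.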

Mathematics Subject Classification: 60G15.

• Figure 1.  Graphical illustration of augmented Gaussian random field prediction with measurement noise. There are three layers: input layer, hidden layer, and output layer. The hidden layer is governed by the augmented Gaussian random field. The observable and its derivatives of different orders are integrated into the same field to make predictions

Figure 2.  [Composite function (noiseless)] Prediction of the observable, first order derivative, and second order derivative by AGRF. Case 1: the data include the observable only. Case 2: the data include the observable and first order derivative. Case 3: the data include the observable and second order derivative. Case 4: the data include the observable, first order derivative, and second order derivative. AGRF is able to integrate the observable and derivatives of any order, regardless of the location where they are collected. The AGRF prediction improves when more information is available

Figure 3.  [Composite function (noiseless)] Comparison of the prediction accuracy of AGRF in different cases. See Figure 2 for more explanations

Figure 4.  [Damped harmonic oscillator (noiseless)] Prediction of the displacement, velocity, and phase-space diagram by different methods. GP: the data include the observable and first order derivative; the observable data are used to predict the displacement and the first order derivative data are used to predict the velocity, respectively. GEK: the data include the observable and first order derivative; all the data are used jointly in the same random field to predict the displacement and velocity at the same time. AGRF: the data include the observable, first order derivative, and second order derivative; all the data are used together in the same random field to predict the displacement and velocity at the same time. GEK produces better predictions than GP, while AGRF predicts more accurately than GEK. By using all the available information together in the same random field, we can construct the most accurate surrogate model

Figure 5.  [Damped harmonic oscillator (noiseless)] Comparison of the prediction accuracy by different methods. See Figure 4 for more explanations

Figure 6.  [Korteweg-De Vries equation (noisy)] Top: the solution at $t = 0.5$ is studied. Bottom: prediction of the observable, first order derivative, and second order derivative by AGRF under different levels of noise. AGRF has good performance even when the noise is as high as 40%. As one might expect, the AGRF prediction is better when the noise is lower

Figure 7.  [Korteweg-De Vries equation (noisy)] Comparison of the prediction accuracy under different levels of noise. See Figure 6 for more explanations

Figure 8.  [Burgers' equation (noisy)] Top: the solution at $t = 0.5$ is studied. Bottom: prediction of the observable, first order derivative, and second order derivative by different AGRF calibrations. No $\delta$: noiseless formulation is used despite the presence of noise in the data, i.e., $\delta_0 = \delta_1 = \delta_2 = 0$ in Eqn. (87). One $\delta$: the same noise intensity is used for different order derivatives, i.e., $\delta_0 = \delta_1 = \delta_2$ in Eqn. (87). Multiple $\delta$: different noise intensities are used for different order derivatives, i.e., the same as Eqn. (87). When the noiseless formulation is used despite the presence of noise in the data, overfitting is an issue. When the same noise intensity is used for different order derivatives, the uncertainty in the prediction is incompatible with the data since different order derivatives have different scales. When the formulation is exactly the same as Eqn. (87), AGRF has the best performance
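The "multiple $\delta$" calibration in the caption above assigns each derivative order its own noise intensity. A small sketch of how such a block-diagonal noise covariance might be assembled is given below; the values and variable names are hypothetical, and the paper's Eqn. (87) is not reproduced:

```python
import numpy as np

n = 4                        # observations per derivative order (assumed)
d0, d1, d2 = 0.1, 0.5, 2.0   # hypothetical noise intensities per order

# Each derivative order gets its own delta on the diagonal, reflecting that
# different order derivatives have different scales, so a single shared
# noise level would misstate the uncertainty for some of them.
Sigma = np.diag(np.concatenate([
    d0**2 * np.ones(n),   # observable noise variance
    d1**2 * np.ones(n),   # first order derivative noise variance
    d2**2 * np.ones(n),   # second order derivative noise variance
]))

# The noisy posterior then uses (K + Sigma)^{-1} wherever the noiseless
# formulation uses K^{-1}, with K the joint covariance over all orders.
```

Setting all three deltas equal recovers the "one $\delta$" calibration, and setting them to zero recovers the overfitting-prone "no $\delta$" case.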

Figure 9.  [Burgers' equation (noisy)] Comparison of the prediction accuracy by different AGRF calibrations. See Figure 8 for more explanations. The relative $L_2$ errors in the case "no $\delta$" are greater than $1.6$ and lie outside the plotted range
