CONVERGENCE AND CENTER MANIFOLDS FOR DIFFERENTIAL EQUATIONS DRIVEN BY COLORED NOISE


(Communicated by Rafael de la Llave)

Abstract. In this paper, we study the convergence and pathwise dynamics of random differential equations driven by colored noise. We first show that the solutions of the random differential equations driven by colored noise with a nonlinear diffusion term uniformly converge in mean square to the solutions of the corresponding Stratonovich stochastic differential equation as the correlation time of colored noise approaches zero. Then, we construct random center manifolds for such random differential equations and prove that these manifolds converge to the random center manifolds of the corresponding Stratonovich equation when the noise is linear and multiplicative as the correlation time approaches zero.

1. Introduction. This paper is concerned with the convergence and pathwise dynamics of the following random differential equation on R^n driven by colored noise:
$$\dot u_\delta = f(u_\delta) + \sigma(u_\delta)\, z_\delta(\theta_t\omega), \tag{1}$$
where f : R^n → R^n and σ : R^n → R^{n×l} are nonlinear functions, and z_δ(θ_t ω) is an l-dimensional colored noise with correlation time δ > 0.
It is known that the probability measure P is an ergodic invariant measure for θ t .
For each δ > 0, we consider the following stochastic differential equation on R^l:
$$dz_\delta = -\frac{1}{\delta}\, z_\delta\, dt + \frac{1}{\delta}\, dW(t). \tag{2}$$
This equation has a unique stationary solution, which is called an Ornstein–Uhlenbeck process or colored noise. For convenience, we denote it by
$$z_\delta(\omega) := \frac{1}{\delta}\int_{-\infty}^{0} e^{s/\delta}\, dW(s). \tag{3}$$
Then we find that
$$z_\delta(\theta_t\omega) = \frac{1}{\delta}\int_{-\infty}^{t} e^{-(t-s)/\delta}\, dW(s). \tag{4}$$
The colored noise has been widely used in physics and biology to study the dynamical behavior of solutions of random systems; see, e.g., [10, 20, 23, 36, 43, 45] and the references therein. Recently, the attractors and invariant manifolds of random differential equations driven by additive or linear multiplicative colored noise have been studied in [11, 12] and [18, 19], respectively.
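As a numerical aside (a sketch, not part of the paper's argument), the colored noise can be simulated by exactly discretizing the Ornstein–Uhlenbeck dynamics. Assuming the standard scalar form dz_δ = −(1/δ) z_δ dt + (1/δ) dW, the stationary law is N(0, 1/(2δ)), which the simulation should reproduce:

```python
import numpy as np

# Exact one-step update for the scalar OU colored noise
#   dz = -(1/delta) z dt + (1/delta) dW,
# whose stationary distribution is N(0, 1/(2*delta)).
def simulate_colored_noise(delta, h, n_steps, seed=0):
    rng = np.random.default_rng(seed)
    a = np.exp(-h / delta)                       # decay factor over one step
    s = np.sqrt((1.0 - a * a) / (2.0 * delta))   # std of the exact OU increment
    z = np.empty(n_steps)
    z[0] = 0.0
    xi = rng.standard_normal(n_steps - 1)
    for k in range(n_steps - 1):
        z[k + 1] = a * z[k] + s * xi[k]
    return z

delta = 0.5
z = simulate_colored_noise(delta, h=0.1, n_steps=50_000)
var = z[1000:].var()                    # discard a burn-in window
print(abs(var - 1.0 / (2.0 * delta)))  # close to 0
```

The exact update avoids any stiffness restriction on the step size, which matters once δ is small.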
A basic question is: do the solutions and dynamics of the random equation (1) converge to those of the corresponding Stratonovich stochastic differential equation
$$du = f(u)\, dt + \sigma(u)\circ dW\,? \tag{6}$$
In this paper, we answer this question. We first show that the solution u_δ of equation (1) indeed converges uniformly in mean square to the solution of equation (6) as the correlation time δ approaches zero. Then, we construct random center manifolds for equation (1) and prove that the center manifolds of the equations driven by colored noise converge to the center manifolds of the corresponding Stratonovich equations when the noise is linear and multiplicative, as the correlation time approaches zero.
More precisely, we state our results as follows. The first result is about the mean square convergence of the solutions of the random differential equation driven by colored noise.
Main Theorem 1. Let u_δ(t, ω, x) and u(t, ω, x) be the solutions of equations (1) and (6) with initial data x at t = 0, respectively. Assume that f_i ∈ C¹_b(R^n) and σ_ij ∈ C²_b(R^n) for all i = 1, …, n and j = 1, …, l. Then, for every T > 0 we have
$$\lim_{\delta\to 0^+}\, \mathbb{E}\,\sup_{t\in[0,T]} |u_\delta(t,\omega,x) - u(t,\omega,x)|^2 = 0,$$
where C^k_b(R^n) is the usual space of C^k smooth functions from R^n to R with bounded derivatives up to order k ∈ N.
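The statement of Main Theorem 1 can be probed numerically. The sketch below (an illustration, not the paper's proof) takes a scalar toy example with f(u) = −u and σ(u) = sin u, drives the colored-noise equation by Euler and the Stratonovich equation by the stochastic Heun scheme along the same Brownian paths, and estimates the mean-square sup-error by Monte Carlo. The OU form z' = −z/δ + W'/δ for the colored noise and the specific f, σ are assumptions of this sketch:

```python
import numpy as np

# Wong-Zakai style check: colored-noise ODE  u' = f(u) + sigma(u) z_delta
# versus the Stratonovich SDE  du = f(u) dt + sigma(u) o dW,
# both driven by the same Brownian increments.
f = lambda u: -u
sigma = lambda u: np.sin(u)        # illustrative smooth bounded diffusion

def mean_square_sup_error(delta, T=1.0, h=5e-4, n_paths=400, x0=0.5):
    rng = np.random.default_rng(42)          # same Brownian paths for every delta
    n = int(T / h)
    u_col = np.full(n_paths, x0)             # colored-noise solution (Euler)
    u_str = np.full(n_paths, x0)             # Stratonovich solution (Heun)
    z = np.zeros(n_paths)                    # OU colored noise z' = -z/delta + W'/delta
    sup_gap = np.zeros(n_paths)
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(h), n_paths)
        u_col = u_col + (f(u_col) + sigma(u_col) * z) * h
        z = z + (-z / delta) * h + dW / delta
        pred = u_str + f(u_str) * h + sigma(u_str) * dW          # Heun predictor
        u_str = u_str + 0.5 * (f(u_str) + f(pred)) * h \
                      + 0.5 * (sigma(u_str) + sigma(pred)) * dW  # Heun corrector
        sup_gap = np.maximum(sup_gap, np.abs(u_col - u_str))
    return float(np.mean(sup_gap ** 2))      # estimate of E sup_[0,T] |u_delta - u|^2

err_big, err_small = mean_square_sup_error(0.1), mean_square_sup_error(0.005)
print(err_small < err_big)   # the error shrinks with the correlation time
```

The stochastic Heun scheme is used because it converges to the Stratonovich (not Itô) solution, which is the limit identified by the theorem.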
The second result is on the existence of center manifolds at the stationary solution u = 0 for the random differential equation
$$\dot u_\delta = A u_\delta + f(u_\delta) + \sigma(u_\delta)\, z_\delta(\theta_t\omega) \tag{7}$$
for each δ > 0, where A is a partially hyperbolic n × n matrix, f and σ are nonlinear terms, and z_δ(θ_t ω) is an l-dimensional colored noise. Assume that f and σ satisfy f(0) = 0 and σ(0) = 0, and that there exist constants M_0, R_0 > 0 and ε_0 ∈ (0, 1] such that condition (8) is satisfied.

Main Theorem 2. Assume that A is a partially hyperbolic n × n matrix and that f and σ satisfy (8) with f(0) = 0 and σ(0) = 0. Then equation (7) has a local Lipschitz center manifold.
Remark 1. Replacing the colored noise by an l-dimensional white noise W(t) in equation (7), we consider the corresponding Stratonovich stochastic differential equation
$$du = (Au + f(u))\, dt + \sigma(u)\circ dW. \tag{9}$$
By the theory due to Kunita and others, this equation generates a random dynamical system. However, it is not known how to establish the existence of its center manifolds.
The last result is on the convergence of the center manifolds of the random differential equations with linear multiplicative noise. More precisely, we consider the random differential equation driven by a linear multiplicative colored noise,
$$\dot u_\delta = A u_\delta + f(u_\delta) + u_\delta\, z_\delta(\theta_t\omega), \tag{10}$$
and the corresponding Stratonovich stochastic differential equation
$$du = (Au + f(u))\, dt + u\circ dW. \tag{11}$$

2. Convergence of solutions of RDEs driven by colored noise. In this section, we consider the stochastic differential equation
$$du = f(u)\, dt + \sigma(u)\circ dW \tag{12}$$
and its corresponding equation driven by colored noise of the form
$$\dot u_\delta = f(u_\delta) + \sigma(u_\delta)\, z_\delta(\theta_t\omega), \tag{13}$$
where W(t) = (W_1(t), …, W_l(t)) is a two-sided Brownian motion and z_δ(θ_t ω) is the colored noise given by (4). Throughout this section, we assume that f and σ are Lipschitz continuous, i.e., there is a constant L > 0 such that
$$|f(u)-f(v)| + |\sigma(u)-\sigma(v)| \le L\,|u-v| \quad\text{for all } u, v \in \mathbb{R}^n. \tag{14}$$
For the classical Wiener space (Ω, F, P), it follows from the law of the iterated logarithm that there exists a θ_t-invariant subset Ω̃ of Ω of full measure on which the sample paths have sublinear growth. Define
$$C_\omega := \sup_{s\in\mathbb{Q}} \frac{|\omega(s)|}{1+|s|},$$
where Q is the set of rational numbers. By the pathwise continuity of the Wiener process, we find that C_ω : Ω̃ → R_+ is a measurable function and
$$|\omega(s)| \le C_\omega\,(1+|s|) \tag{15}$$
for all s ∈ R. Recall that θ_t ω(s) = ω(s + t) − ω(t); it then follows that
$$|z_\delta(\theta_t\omega)| \le K_\delta\, C_\omega\,(1+|t|),$$
where K_δ := (2/δ)(δ + 1). This estimate plays a key role in the proof of the well-posedness of equation (13).
Consider the space (Ω̃, F̃, P̃), where F̃ = {Ω̃ ∩ A : A ∈ F} and P̃ is the restriction of P to F̃. We will restrict our study to (Ω̃, F̃, P̃), which we again denote by (Ω, F, P). Let N be the collection of all null sets of (Ω, F, P). Given t ∈ R, denote by F_t the σ-algebra generated by the increments {ω(u) − ω(v) : u, v ≤ t} augmented by N; then (Ω, F, P, (F_t)_{t∈R}, (θ_t)_{t∈R}) is a filtered dynamical system.

Proposition 1. Assume (14) holds. Then, for each δ > 0 we have the following: (i) equation (13) has a unique solution u_δ(t, ω, x) defined for all 0 ≤ t < +∞; (v) u_δ(t, ω, x) generates a random dynamical system.
Since the proof of this proposition follows from standard arguments, we omit it. Let
$$F_\delta(\omega, u) := f(u) + \sigma(u)\, z_\delta(\omega).$$
Then the random differential equation (13) can be written as
$$\dot u_\delta = F_\delta(\theta_t\omega, u_\delta). \tag{16}$$
For any T > 0, we shall show in what follows that the solutions of equation (16) converge in mean square to the solutions of equation (12) uniformly on [0, T] as δ → 0⁺. From now on, we use K to denote a generic positive constant whose value may change from line to line but does not depend on δ.
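Before the general estimates, the linear special case f = 0, σ(u) = u makes the convergence mechanism explicit. Assuming the OU form δ dz_δ = −z_δ dt + dW, integration gives ∫₀ᵀ z_δ dt = W(T) − δ(z_δ(T) − z_δ(0)), so the colored-noise solution x·exp(aT + ∫z_δ) of u̇ = au + u z_δ differs from the Stratonovich solution x·exp(aT + W(T)) of du = au dt + u ∘ dW by a factor exp(−δ(z_δ(T) − z_δ(0))) → 1. The sketch below (a toy illustration, not from the paper) checks this; the left-endpoint Euler scheme even satisfies the integral identity exactly:

```python
import numpy as np

rng = np.random.default_rng(3)
a_coef, x0, T, h, delta = -1.0, 1.0, 1.0, 1e-3, 0.01
n = int(T / h)
dW = rng.normal(0.0, np.sqrt(h), n)

# Euler path of the OU colored noise z' = -z/delta + W'/delta, z(0) = 0
z = np.zeros(n + 1)
for k in range(n):
    z[k + 1] = z[k] - (z[k] / delta) * h + dW[k] / delta

W_T = dW.sum()
int_z = h * z[:-1].sum()               # left Riemann sum of int_0^T z dt
# Discrete version of int_0^T z dt = W(T) - delta*(z(T) - z(0)); summing the
# Euler recursion shows this identity holds exactly for the scheme above.
gap = abs(int_z - (W_T - delta * (z[-1] - z[0])))

u_colored = x0 * np.exp(a_coef * T + int_z)  # solves u' = a u + u z_delta
u_strat = x0 * np.exp(a_coef * T + W_T)      # solves du = a u dt + u o dW
print(gap, abs(u_colored / u_strat - 1.0))
```

The correction factor exp(−δ z_δ(T)) is O(√δ) in probability, which is the same rate that appears in the general estimates below.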
We first use the Burkholder-Davis-Gundy inequality to obtain the following estimates.
Lemma 2.1. Let W_0 be a two-sided real-valued Brownian motion and let f : R_− → R be a function such that ∫_{−∞}^{0} |f(τ)|² dτ < ∞. Then for arbitrary p > 0 the corresponding moment bound holds.

Proof. Applying the Itô isometry to the right-hand side of the above identity, we get an estimate which, along with (17), implies the second-moment bound. On the other hand, by the Burkholder–Davis–Gundy inequality, for any u > 0 we have the corresponding maximal estimate. Therefore, we obtain the desired bound.
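The Itô isometry used in this proof, E[(∫₀ᵀ f dW)²] = ∫₀ᵀ f(t)² dt for deterministic square-integrable f, is easy to verify by Monte Carlo. The sketch below uses the illustrative integrand f(t) = eᵗ, which is not tied to the paper's specific integrand:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n, M = 1.0, 200, 20_000     # horizon, time steps, sample paths
h = T / n
t = np.arange(n) * h
f = np.exp(t)                  # deterministic integrand; int_0^1 f^2 dt = (e^2 - 1)/2

dW = rng.normal(0.0, np.sqrt(h), size=(M, n))
I = dW @ f                     # Riemann-Ito sums approximating int_0^T f dW
lhs = float(np.mean(I ** 2))   # Monte Carlo estimate of E[(int f dW)^2]
rhs = float(np.sum(f ** 2) * h)
print(abs(lhs / rhs - 1.0))    # small: the Ito isometry
```

Comparing against the discretized integral (rather than the closed form) isolates the statistical error from the Riemann-sum discretization error.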
The following lemma is a summary of basic properties of the approximations of a Brownian motion.
The next lemma is on the moments of the approximations.
Proof. Applying Fubini's theorem, we first note the corresponding identity. By the independence of the increments dW_i(τ) and the Itô isometry, we get the first estimate. Similarly, we find the second, as desired.
For Π_1( t (δ), t), using the Hölder inequality, we obtain a first estimate. By Lemma 2.2 (1) and (5) and the invariance of the probability measure P, we then obtain a bound, and thus, from (23), the desired estimate for Π_1( t (δ), t). For Π_2( t (δ), t), we split it into two parts.
where the third inequality follows from the martingale inequality, and the fourth from the fact that ω_j((k + 1)δ) − ω_j(kδ) has the same distribution as ω_j(δ), together with the Brownian scaling property.
Here the martingale inequality is used to obtain the last estimate. Then, using the Itô isometry, we obtain the corresponding integral bound.
Since f ∈ C¹_b and σ ∈ C²_b, by (21) we have a pointwise bound for t > s, and hence the corresponding estimate. Since σ has bounded derivatives, the remaining term is bounded as well. Combining (22) and (24) completes the proof of the lemma.

Lemma 2.7. The following estimate holds.

Finally, we estimate Π(δ, t (δ)). Recall the decomposition of Π(δ, t (δ)) from (22). We first rewrite Π_1(δ, t (δ)) using integration by parts. From now on, we always assume that δ^{0.9} ≤ δ̂, which holds for all small δ since δ̂/δ^{0.9} → +∞ as δ → 0⁺. We then write Π_11(δ, t (δ)) as a sum of three parts.
On the one hand, by Fubini's theorem, we obtain an identity which allows us to split Π_11(δ, t (δ)) into several parts, so that Π_11(δ, t (δ)), and hence Π(δ, t (δ)), can be rewritten accordingly. Next, we estimate each term on the right-hand side of equation (27). We first summarize the estimates as follows.
Lemma 2.8. Let T > 0 be a fixed constant. Then the estimates (28)–(33) hold as δ → 0⁺.

Proof. We prove the statements one by one.
(I) Estimate of Υ^j_1(δ, t (δ)). By the martingale inequality and the Itô isometry, we obtain a first bound. From (20), we have a pointwise estimate. Then, changing the variable r to r + s(δ) − δ^{0.9} in the integral, and using Lemma 2.2 (1) and (5) and the θ_t-invariance of P, we obtain the desired bound. This completes the proof of property (28).
Recall the definition of the term under consideration. Using the Cauchy inequality, we obtain a first estimate, and from (20) a pointwise bound. Thus, by Lemma 2.2 (1) and the θ_t-invariance of P, we obtain an intermediate estimate. Then, changing the variable s to s + kδ − δ^{0.9} and using Lemma 2.2 (1) and the θ_t-invariance of P, we arrive at the desired bound, where in the second inequality Lemma 2.3 is used, together with the facts that m(T)δ ≤ T and δ̂ = n(δ)δ, for d = 1, …, l. This completes the proof of property (29).
Note the decomposition of the remaining term. In order to estimate Π_112(δ, t (δ)), we introduce an auxiliary quantity. Hence, by the martingale inequality and the Itô isometry, we obtain the desired bound. This completes the proof of property (30).

JUN SHEN, KENING LU AND BIXIANG WANG
It is well known that for each 0 < α < β, there is a constant K ≥ 1 such that the exponential dichotomy estimates hold. Furthermore, we assume that there exist constants M_0, R_0 > 0 and ε_0 ∈ (0, 1] such that condition (8) is satisfied. By (15), we obtain a pathwise bound on the noise. We note that C_ω is tempered from above and that C_{θ_t ω} is locally integrable in t.
Let ρ : Ω → (0, +∞) be a random variable tempered from below such that ρ(θ t ω) is locally integrable in t. We consider a modification of F (ω, u). Let
Clearly, its solutions also generate a random dynamical system. Fixing α, we define the corresponding Banach space of functions with growth rate γ = (α + β)/2. Then, for each δ > 0, we define the set M^c_δ(ω), which is not empty. In what follows, we will prove that M^c_δ(ω) is a Lipschitz manifold, given by the graph of a Lipschitz function, and that it is invariant under the random dynamical system generated by equation (40). The next lemma gives a description of the set M^c_δ(ω) by means of an integral equation.
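The graph-and-invariance point of view can be made concrete on a deterministic toy system (this example is not from the paper): for ẋ = xy, ẏ = −y + x², with one center and one stable direction, the center manifold is a graph y = h(x), and substituting it into the invariance equation h′(x)·x h(x) = −h(x) + x² determines the Taylor coefficients of h recursively:

```python
import numpy as np

# Deterministic toy analogue of the graph construction: for the planar system
#   x' = x*y,   y' = -y + x**2,
# seek the center manifold as a graph y = h(x) = sum_n c_n x^n.  The coefficient
# of x^n in h'(x)*(x*h(x)) is sum_{k+j=n} k*c_k*c_j, which only involves lower
# orders since c_0 = c_1 = 0, so matching powers of x in the invariance
# equation  h'(x)*(x*h(x)) = -h(x) + x**2  yields a recursion.
def center_manifold_coeffs(max_deg):
    c = np.zeros(max_deg + 1)            # c[n] is the coefficient of x**n
    for n in range(2, max_deg + 1):
        p = sum(k * c[k] * c[n - k] for k in range(2, n - 1))
        c[n] = (1.0 if n == 2 else 0.0) - p
    return c

c = center_manifold_coeffs(6)
print(c[2], c[4], c[6])   # 1.0 -2.0 12.0, i.e. h(x) = x^2 - 2x^4 + 12x^6 + ...
```

The same order-by-order solvability is what fails to be purely algebraic in the random setting, where the Lyapunov–Perron integral equation over the noise path replaces the polynomial matching.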

4. Convergence of center manifolds of random equations. In this section, we establish the convergence of the center manifolds of the random differential equation driven by colored noise,
$$\dot u_\delta = A u_\delta + f(u_\delta) + u_\delta\, z_\delta(\theta_t\omega).$$
More precisely, we will show that its center manifolds converge as δ → 0 to the center manifold of the stochastic differential equation
$$du = (Au + f(u))\, dt + u\circ dW.$$
Here, we assume that A is an n × n matrix whose eigenvalues all have zero real parts, f is a globally Lipschitz continuous function with f(0) = 0, and z_δ(θ_t ω) and W(t, ω) are one-dimensional noises. First, we recall the following lemma from [11]. Next, we consider a linear stochastic differential equation:
$$dz = -z\, dt + dW. \tag{52}$$
For this equation, we borrow the following results from [8]:
(1) For ω ∈ Ω the random variable
$$z(\omega) := -\int_{-\infty}^{0} e^{s}\, \omega(s)\, ds$$
exists and generates a unique stationary solution of (52) given by z(θ_t ω). The mapping t → z(θ_t ω) is continuous.
(2) In particular, on Ω̃ the stationary solution has sublinear pathwise growth.

For each δ > 0, replacing the white noise in equation (52) by z_δ(θ_t ω), we get
$$\dot z_\delta = -z_\delta + z_\delta(\theta_t\omega). \tag{53}$$
Let z(t, ω, x) and z_δ(t, ω, x) be the solutions of equations (52) and (53) with initial data x at t = 0, respectively. The following lemma shows that z_δ(t, ω, x) is a uniform approximation of z(t, ω, x).

Proof. We first choose a positive constant T such that [T_1, T_2] ⊂ [−T, T]. Note that z_δ(t, ω, x) and z(t, ω, x) satisfy the corresponding integral equations. Therefore, we obtain an estimate on their difference, and by Gronwall's inequality we conclude.

(1) For each δ > 0, the random variable z*_δ(ω) exists and generates a stationary solution of (53) given by z*_δ(θ_t ω). The mapping t → z*_δ(θ_t ω) is continuous.
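The uniform approximation can be sanity-checked numerically: driving ẏ = −y + q(t) and dz = −z dt + dW with the same Brownian increments, where q is the colored noise, the sup-gap on [0, T] shrinks with δ. The OU form q′ = −q/δ + W′/δ for the colored noise and the Euler discretization are assumptions of this sketch:

```python
import numpy as np

def sup_gap(delta, T=1.0, h=1e-4, seed=7):
    """Sup-norm gap on [0, T] between the Euler solutions of
       dz = -z dt + dW   and   y' = -y + q(t),
    where q is the OU colored noise q' = -q/delta + W'/delta,
    all driven by the same Brownian increments and started at 0."""
    rng = np.random.default_rng(seed)
    n = int(T / h)
    z = y = q = 0.0
    gap = 0.0
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(h))
        z_next = z - z * h + dW          # Euler-Maruyama for the SDE
        y_next = y + (-y + q) * h        # Euler for the colored-noise ODE
        q = q - (q / delta) * h + dW / delta
        z, y = z_next, y_next
        gap = max(gap, abs(y - z))
    return gap

print(sup_gap(0.1), sup_gap(0.001))   # the second gap is much smaller
```

Both schemes use the same increment dW per step, so the observed gap reflects the correlation time rather than independent sampling noise.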
Then we get, for all t ∈ R, the corresponding representation. By (57), we obtain an estimate in which T* is a positive number to be specified later. For the first integral on the right-hand side, we have a bound. We first choose a sufficiently large T* > 0 such that
$$8\int_{-\infty}^{-T^*} e^{r}\, dr < \varepsilon,$$
and note that there exists T_4 = T_4(ω, ε) > 0 such that for all |t| > T_4,
$$\frac{4}{|t|}\int_{-\infty}^{0} e^{r}\,\bigl(M(\omega) + |r|\bigr)\, dr < \varepsilon.$$
Remark. (1) The center manifolds obtained in Theorem 4.6 are C¹ smooth if f ∈ C¹. (2) Under certain conditions, results similar to Theorem 4.6 also hold for stable and unstable manifolds.
(3) If f is an arbitrary C¹ function with f(0) = 0 and Df(0) = 0, one can use the standard procedure of modifying f with a smooth cut-off function so that the modified function is globally Lipschitz continuous with a desired small Lipschitz constant. Thus, applying the results obtained here, one obtains the convergence of the local center manifolds of the equations driven by colored noise.