Optimal control problems for some ordinary differential equations with behavior of blowup or quenching

This paper is concerned with optimal control problems for equations exhibiting blowup or quenching. We first study the existence of, and Pontryagin's maximum principle for, optimal controls that have the minimal energy among all controls whose corresponding solutions blow up at the right-hand time end-point of a given cost functional. Then, the same problem is discussed for the quenching case. Finally, we establish Pontryagin's maximum principle for optimal controls of extended problems after quenching.


1. Introduction. It is well known that solutions to some evolution equations exhibit blowup or quenching. Such equations describe a large class of phenomena in applied science. For instance, equations with the blowup property can describe the dramatic increase in temperature leading to the ignition of a chemical reaction, while equations with the quenching property can represent the potential differences of the polarization field in ionic conductors reaching equilibrium. Roughly speaking, blowup means that a solution becomes unbounded in finite time. Quenching means that the time derivative of the solution goes to infinity in finite time while the solution itself remains bounded. In the past decades, such equations have attracted many researchers' attention (see, for instance, [2], [3], [5]-[11]). In recent years, optimal control problems for the minimal or maximal blowup/quenching time have been studied by several authors (see [4], [13]-[16], [18]-[20]).
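Two standard scalar model equations (textbook examples, not taken from this paper) make the distinction between blowup and quenching concrete:

```latex
% Blowup: the solution itself becomes unbounded in finite time.
\[
  y' = y^2,\quad y(0)=y_0>0
  \;\Longrightarrow\;
  y(t)=\frac{y_0}{1-y_0\,t}\to+\infty \quad\text{as } t\uparrow T^*=\frac{1}{y_0}.
\]
% Quenching: y stays bounded while y' blows up in finite time.
\[
  y' = \frac{1}{1-y},\quad y(0)=y_0\in(0,1)
  \;\Longrightarrow\;
  y(t)=1-\sqrt{(1-y_0)^2-2t},\qquad
  y(t)\to 1,\;\; y'(t)\to+\infty \quad\text{as } t\uparrow T^*=\frac{(1-y_0)^2}{2}.
\]
```

In the second example the solution remains bounded by 1, yet its slope becomes vertical at the quenching time.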
A natural and interesting problem is to find the control minimizing a given cost functional among all controls whose corresponding solutions blow up or quench at the right-hand time end-point of the functional. This is one purpose of this paper. Optimal control problems for some partial differential equations with the blowup property have been studied in a few papers (see, for instance, [1], [17]). However, the aim of those papers was to find an optimal control with minimal energy among all controls whose corresponding solutions exist on the whole closed interval where the cost functional is defined. On the other hand, Lin [14] studied the extendability of solutions and an optimal control problem after quenching for some ordinary differential equations. It was shown in [14] that a solution which quenches for the first time at a finite time may behave in different ways: it may not be extendable after quenching, may be extendable uniquely, or may have at least two extended solutions. The extended solutions may quench a second time. Thus, we can also consider the following optimal control problem: for a given extendable solution which quenches for the first time at a finite time, one seeks the control with minimal energy among all extended controls that make the corresponding extended solution quench for the second time at the right-hand time end-point of a cost functional. This is the other purpose of the paper.
The quenching problem for ordinary differential equations is more complex than the blowup one. If the set of quenching points of an ordinary differential equation contains only one point (or finitely many isolated points), then, in principle, the quenching problem can be converted into a blowup problem by a suitable change of variables. However, if some quenching point has a neighborhood containing infinitely many quenching points, the quenching problem can in general not be transformed into a blowup one. Furthermore, blowup of a solution of an ordinary differential equation means that its Euclidean norm goes to infinity in finite time, so there is no need to consider the behavior of the solution after blowup. In contrast, as stated above, a solution may be extendable after quenching.
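For the single-quenching-point case, the conversion mentioned above can be made explicit on the scalar model equation (a sketch of ours, not from the paper): setting $z = 1/(1-y)$,

```latex
\[
  y' = \frac{1}{1-y}, \qquad z := \frac{1}{1-y}
  \;\Longrightarrow\;
  z' = \frac{y'}{(1-y)^2} = z^2\, y' = z^3 ,
\]
```

so $y$ quenches (i.e. $y \uparrow 1$ with $y' \to +\infty$) at time $T^*$ exactly when $z$ blows up at $T^*$.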
In this paper, we consider the following controlled differential equation:

  y'(t) = f(t, y(t), u(t)),  t > 0;   y(0) = y_0.    (1)

In the above, y_0 ∈ R^n is an initial datum and f(·, ·, ·) : [0, +∞) × R^n × U → R^n is a given map, where U is a nonempty set in R^m. Here y(·) is called the state trajectory, taking values in R^n, and u(·) is called the control, taking values in U. We say that y(·) ∈ C([0, T); R^n) is a solution of (1) on [0, T) if y(·) satisfies

  y(t) = y_0 + ∫_0^t f(s, y(s), u(s)) ds,  t ∈ [0, T).

We shall first study the existence of, and Pontryagin's maximum principle for, optimal controls which have the minimal energy among all controls whose corresponding solutions blow up at the right-hand time end-point of a given functional. Then, the same problem will be discussed for the quenching case. Finally, among all extended controls whose corresponding extended solutions quench for the second time at the right-hand time end-point of a cost functional, Pontryagin's maximum principle will be established for the optimal controls of minimal energy. This differs from the problem considered in [14], where no admissible state quenches for the second time at any point of the closed interval on which the cost functional is defined.
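As a numerical illustration (not part of the paper's analysis; the scalar equation, step size, and threshold below are our own illustrative choices), consider the controlled equation y'(t) = y(t)^2 + u(t). A forward-Euler sketch can estimate the blowup time by monitoring when the state exceeds a large threshold:

```python
def euler_blowup_time(f, y0, dt=1e-5, t_max=5.0, threshold=1e5):
    """Integrate y' = f(t, y) by forward Euler and return the first time
    the state exceeds `threshold` (a numerical proxy for blowup),
    or None if no blowup is detected before t_max."""
    t, y = 0.0, float(y0)
    while t < t_max:
        if abs(y) > threshold:
            return t
        y += dt * f(t, y)
        t += dt
    return None

# Zero control u = 0: y' = y^2 with y(0) = 1 blows up at t = 1.
t_star = euler_blowup_time(lambda t, y: y * y, 1.0)

# Constant control u = -1: y' = y^2 - 1 with y(0) = 1 rests at an
# equilibrium, so no blowup is detected.
t_none = euler_blowup_time(lambda t, y: y * y - 1.0, 1.0)
```

The two runs illustrate how the control can alter the blowup behavior: with u ≡ 0 the detected blowup time is close to the exact value 1/y_0 = 1, while u ≡ -1 suppresses blowup entirely for this initial datum.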
As far as we know, since equations with the blowup or quenching property exhibit singularities at the blowup or quenching time, the key to obtaining Pontryagin's maximum principle for optimal control problems related to blowup or quenching is to establish the so-called "initial period optimality". For instance, in order to obtain Pontryagin's maximum principle, Lin and Wang [15] pointed out, in the proof of Corollary 1.3, that the optimal controls of the minimal blowup time enjoy both the initial period optimality and the terminal period optimality for a special autonomous system; for general controlled autonomous systems, Lou and Wang [19] established the initial period optimality for the optimal controls of the minimal and the maximal blowup/quenching time; meanwhile, for the non-autonomous case with the blowup property, Lou and Wang [18] introduced a class of monotone systems and deduced the initial period optimality from the monotonicity. However, for non-autonomous and non-monotone systems with the blowup property, and for non-autonomous systems with the quenching property and more than one quenching point (even if they are monotone), the initial period optimality may fail for blowup/quenching time optimal control problems. Fortunately, we can establish the initial period optimality for our optimal control problems, and by means of results on classical optimal control problems with terminal constraints, the corresponding Pontryagin's maximum principle can be established in this paper for general systems without autonomy or monotonicity assumptions.
The rest of the paper is organized as follows. In Section 2, we obtain the existence and Pontryagin's maximum principle for optimal controls in the blowup case. Section 3 concerns the quenching case. In Section 4, we establish Pontryagin's maximum principle for optimal controls of extended problems after quenching. Finally, we make further remarks on some open problems in the last section.
Then, we study the following optimal control problem.
Lemma 2.1. (See [21]) Suppose that U ⊆ R m is compact. Then, for any T > 0, the relaxed control set R T (U ) is convex and sequentially compact.
(A1) The set U ⊆ R m is compact.
To make preparations for the existence result of Problem (RPB), we need the following lemma.
then it holds that where L_{M+1} is defined in (A3). Moreover, (5) implies that for some K_ε > 0, where Combining (7) with (8) and (9), we have Further, for each k ≥ K_ε, let It remains to prove that t_ε^k = T. Obviously, t_ε^k > 0. By contradiction, suppose that t_ε^k < T. Then, by (A3) and (10), it follows that By Grönwall's inequality, we have which contradicts the definition of t_ε^k. Thus, t_ε^k = T. Combining this with the continuity of y(·) and y^k(·) for all k ∈ N, we deduce that (6) holds.
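For the reader's convenience, the Grönwall step invoked above has the standard integral form (stated generically, since the displayed estimates are omitted here; in such perturbation arguments v(t) is typically the deviation |y^k(t) − y(t)|):

```latex
\[
  v(t) \le C + L \int_{0}^{t} v(s)\,ds \quad \text{for all } t\in[0,T]
  \;\Longrightarrow\;
  v(t) \le C\, e^{L t} \quad \text{for all } t\in[0,T],
\]
```

valid for continuous $v \ge 0$ and constants $C \ge 0$, $L \ge 0$.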
We have completed the proof.
By Lemma 2.2, we can obtain the following corollary.
Proof. Suppose that y(·) does not blow up at T_1. There are two cases to discuss.
Case One. y(·) exists globally or blows up at some finite time S > T_1.
In this case, by repeating the strategy of Lemma 2.2, we find that y^k(·) exists on [0, T_1] when k is large enough, which contradicts the fact that {(y^k(·), σ^k(·))}_{k=1}^{+∞} ⊆ RPB_ad.
Case Two. y(·) blows up at some finite time S < T_1.
By (A4), there exists an where R_0 is defined in (A4). On the other hand, since y(·) blows up at S, we can easily find a T ∈ (0, S) such that By an argument similar to that in the proof of Lemma 2.2, we can find a K > 0 such that Then, by (A4), we can use an argument similar to that in the proof of Lemma 2.3 in [15] to obtain Then, it follows by (A4) that Combining (11), (12) and (13), we obtain that for any k ≥ K, which contradicts the fact that S < T_1. Thus, we have proved that y(·) blows up at T_1.
By (18), we can use Filippov's Lemma to obtain a u*(·) ∈ U_{T_1} such that which ensures that (y*(·), u*(·)) is an optimal pair of Problem (PB). Now we aim to obtain Pontryagin's maximum principle for Problem (PB) by establishing the initial period optimality. To this end, we introduce a class of auxiliary problems. Precisely, for any T ∈ (0, T_1), denote where (y*(·), u*(·)) is an optimal pair of Problem (PB).
Define the optimal control problems as follows.

The following lemma shows the relation between Problem (PB T ) and Problem (PB).
On the basis of Lemma 2.5, we can establish Pontryagin's maximum principle for Problem (PB). We need an extra assumption.
3. Quenching case. In this section, we aim to establish the existence and Pontryagin's maximum principle for optimal control problems governed by systems with the quenching phenomenon. Precisely, for a given nonempty set Ω, we consider the case where the function f in (1) may be unbounded near Ω. For this reason, we need assumptions similar to those of the blowup case. We restate these assumptions as follows, except for (A1), (A5) and (A6).
(H5) There is a constant r > 0 such that for any ϕ ∈ R^n, Remark 3. For convenience, we assume in (H1) that the set Ω is convex, which ensures the differentiability of d_Ω(·) wherever it takes values in (0, +∞), as used in (H4). In the remainder of the paper, only the differentiability of d_Ω(·) is needed.
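The differentiability used in the remark above is a standard fact about distance functions of convex sets (recalled here for the reader; the paper's precise hypothesis is (H4)): writing $P_\Omega$ for the metric projection onto $\overline{\Omega}$, which is single-valued by convexity,

```latex
\[
  d_\Omega(x) = \min_{z\in\overline{\Omega}} |x - z| = |x - P_\Omega(x)|,
  \qquad
  \nabla d_\Omega(x) = \frac{x - P_\Omega(x)}{d_\Omega(x)}
  \quad \text{for } x \notin \overline{\Omega},
\]
```

so $d_\Omega(\cdot)$ is continuously differentiable on the set where it is positive.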

PING LIN AND WEIHAN WANG
Similarly, we study the following optimal control problem.
By an argument similar to that in the proof of Lemma 2.2, we can find a K > 0 such that Then, by (H4), we can use techniques similar to those in the proof of Lemma 2.3 in [15] to get

  d_Ω(y^k(t)) ≤ r_1,  ∀ t ∈ [T, S),  k ≥ K.    (36)

Hence, it follows by (H4) that Combining (35), (36) and (37), we obtain that for any k ≥ K, which contradicts the fact that S < T_1.

Thus, we have proved this corollary.
Then, we can easily deduce the existence result for Problem (RPQ). Further, we introduce the initial period optimality of Problem (PQ). Auxiliary optimal control problems are considered. Precisely, for any T ∈ (0, T_1), denote where (y**(·), u**(·)) is an optimal pair of Problem (PQ). Define the auxiliary optimal control problems as follows.
On the basis of the initial period optimality, Pontryagin's maximum principle of Problem (PQ) can be stated as follows.
Remark 6. The existence result and Pontryagin's maximum principle for Problem (PQ) can be proved by adopting the same strategies as for Problem (PB). Thus, we omit the details.

4. Case of extendable control problems after quenching. In this section, we focus on optimal control problems governed by an equation which is extendable after quenching. For convenience, denote where I is an arbitrary subset of R. Then, we give the following definition (see [14]) of the extendability of a solution of (1) after quenching.