SOLVING MALFATTI’S HIGH DIMENSIONAL PROBLEM BY GLOBAL OPTIMIZATION

Abstract. We generalize Malfatti's problem, which dates back 200 years, to a global optimization problem in a high-dimensional space. The problem is formulated as a convex maximization problem over a nonconvex set. The global optimality condition of Strekalovsky [11] is applied to this problem. To solve Malfatti's problem numerically, we employ the algorithm of [3], which converges globally. Some computational results are provided.


1. Introduction. In 1803 the Italian mathematician Malfatti posed the following problem: how can one pack three non-overlapping circles of maximum total area in a given triangle? Malfatti originally assumed that the solution is three circles inscribed in the triangle such that each circle is tangent to the other two and touches two sides of the triangle. It is now well known that Malfatti's solution is not optimal. There are many works devoted to solving Malfatti's problem [13]-[6], [5]-[8]. The most common methods used for finding the best solutions to Malfatti's problem were algebraic and geometric approaches. In 1994 Zalgaller and Los [14], [6] showed that the greedy arrangement is the best one. Based on trigonometric equations and inequalities, using so-called rigid systems, they attempted to find the best solution to Malfatti's problem. Most recently, a new approach for solving Malfatti's problem, based on the global optimality conditions of Strekalovsky [11], was proposed in [2]. The paper is organized as follows. In Section 2, we formulate Malfatti's problem as a convex maximization problem. Global optimality conditions for Malfatti's problem in a high-dimensional space are given in Section 3. In Section 4, computational results are provided.

2. Malfatti's Problem and Convex Maximization. In order to generalize Malfatti's problem as an optimization problem in the high-dimensional case, we proceed in the following steps. First, we equivalently formulate the problem in terms of convex sets such as a ball and a polyhedral set. Secondly, we characterize the inscribed conditions, which mean that the balls lie inside a polyhedral set. For this purpose, we introduce the following sets. Denote by B(x, z) a ball with center x ∈ R^n and radius z ∈ R:

B(x, z) = {y ∈ R^n : ‖y − x‖ ≤ z}.

A bounded and closed polyhedral set D ⊂ R^n is given by

D = {y ∈ R^n : ⟨a_i, y⟩ ≤ b_i, i = 1, 2, . . . , m},

where ⟨·, ·⟩ denotes the scalar product of two vectors in R^n, ‖·‖ is the Euclidean norm, and int D ≠ ∅.

Lemma. B(x, z) ⊂ D if and only if

⟨a_i, x⟩ + z‖a_i‖ ≤ b_i, i = 1, 2, . . . , m.    (3)

Proof.
Necessity. Let y ∈ B(x, z) and y ∈ D. A point y ∈ B(x, z) can be represented as y = x + zh, h ∈ R^n, ‖h‖ ≤ 1. The condition y ∈ D implies that ⟨a_i, y⟩ ≤ b_i, i = 1, 2, . . . , m, or equivalently, ⟨a_i, x⟩ + z⟨a_i, h⟩ ≤ b_i, i = 1, 2, . . . , m, for all h ∈ R^n with ‖h‖ ≤ 1. Taking the maximum over such h, we obtain ⟨a_i, x⟩ + z‖a_i‖ ≤ b_i, i = 1, 2, . . . , m.
Sufficiency. Let condition (3) hold and, on the contrary, assume that there exists ỹ ∈ B(x, z) such that ỹ ∉ D. Clearly, there exists h̃ ∈ R^n with ‖h̃‖ ≤ 1 so that ỹ = x + zh̃. Since ỹ ∉ D, there exists j ∈ {1, 2, . . . , m} for which ⟨a_j, ỹ⟩ > b_j, i.e., ⟨a_j, x⟩ + z⟨a_j, h̃⟩ > b_j. On the other hand, using the Cauchy–Schwarz–Bunyakovsky inequality ⟨a_j, h̃⟩ ≤ ‖a_j‖‖h̃‖ ≤ ‖a_j‖, we have ⟨a_j, x⟩ + z‖a_j‖ > b_j, which contradicts (3).
Now we formulate the inscribed conditions of three balls in a polyhedral set. There are three main cases:
Case 1. The three balls are mutually tangent to each other.
Case 2. One of the balls is tangent to the other two, and the centers of the balls lie on the same line.
Case 3. One of the balls is tangent to the other two, but their centers do not lie on the same line; at the same time, the last two balls do not intersect each other.
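The inclusion condition just proved, namely that B(x, z) ⊂ D exactly when ⟨a_i, x⟩ + z‖a_i‖ ≤ b_i holds for every row a_i, can be checked mechanically. A minimal sketch in Python, where the square-shaped D and the function name are purely illustrative:

```python
import numpy as np

def ball_in_polyhedron(A, b, x, z):
    """Check B(x, z) subset of D = {y : A @ y <= b} via the condition
    <a_i, x> + z * ||a_i|| <= b_i for every row a_i of A."""
    row_norms = np.linalg.norm(A, axis=1)
    return bool(np.all(A @ x + z * row_norms <= b + 1e-12))

# Square [0, 2] x [0, 2] written as A y <= b.
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.array([2.0, 0.0, 2.0, 0.0])

print(ball_in_polyhedron(A, b, np.array([1.0, 1.0]), 1.0))  # inscribed ball -> True
print(ball_in_polyhedron(A, b, np.array([1.5, 1.0]), 1.0))  # sticks out on the right -> False
```

Note that the check is linear in (x, z) for fixed A and b, which is what makes the inscribed conditions tractable constraints in the optimization formulation.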
3. Global optimality conditions and algorithm. In the previous section, we noted that the solution to Malfatti's problem is given by problems (14). Let us consider these problems again. Problems (14) belong to the class of concave programming problems or, equivalently, convex maximization problems.
Global optimality conditions for the convex maximization problem were first formulated by A.S. Strekalovsky in 1987 [11]. We now apply this result to problem (14) as follows.
Theorem 3.1. [11] Let z ∈ S_i satisfy f′(z) ≠ 0. Then z is a solution to problem (14) if and only if
⟨f′(y), x − y⟩ ≤ 0 for all y ∈ E_{f(z)}(f) and x ∈ S_i,
where E_c(f) = {y ∈ R^n | f(y) = c} is the level set of f at c, and f′(y) is the gradient of f at y.
Before presenting an algorithm for solving problem (14), it is useful to restate Theorem 3.1 in a convenient way via the function Θ(z) defined for z ∈ S_i:

Θ(z) = max_{y ∈ E_{f(z)}(f)} Π(y), where Π(y) = max_{x ∈ S_i} ⟨f′(y), x − y⟩.

It has been shown in [3] that the function Π(y) is continuous and directionally differentiable. Since f is strongly convex, the set E_{f(z)}(f) is compact; thus Θ(z) < +∞. We note that Π(y) ≤ Θ(z) for all y ∈ E_{f(z)}(f).
Theorem 3.2. Let z ∈ S_i satisfy f′(z) ≠ 0. Then z is a solution to problem (14) if and only if Θ(z) = 0.
Proof. The assertion follows from Theorem 3.1 and the inequalities
⟨f′(y), x − y⟩ ≤ Π(y) ≤ Θ(z),
which hold for all x ∈ S_i and y ∈ E_{f(z)}(f).
Now we apply Algorithm MAX of [3] to solve problem (14) numerically.
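Since S_i is polyhedral and the map x ↦ ⟨f′(y), x − y⟩ is linear in x, evaluating the inner value Π(y) = max_{x ∈ S_i} ⟨f′(y), x − y⟩ used in Step 2 of the algorithm amounts to solving a linear program. A hedged sketch using scipy's linprog, where the choice f(x) = ‖x‖² (so f′(y) = 2y) and the box-shaped S are purely illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def pi_value(grad, y, A, b):
    """Pi(y) = max_{x in S} <grad, x - y> over S = {x : A x <= b}.
    linprog minimizes, so the linear objective is negated."""
    res = linprog(c=-grad, A_ub=A, b_ub=b,
                  bounds=[(None, None)] * len(y), method="highs")
    x_star = res.x
    return float(grad @ (x_star - y)), x_star

# Illustrative data: f(x) = ||x||^2, S the box [-1, 1]^2, y on the level set f = 1.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.ones(4)
y = np.array([1.0, 0.0])   # f(y) = 1
grad = 2.0 * y             # f'(y) = 2y
val, x_star = pi_value(grad, y, A, b)
print(val, x_star)
```

Here y lies on the boundary of the box, and the computed Π(y) is zero, consistent with the role of Π in the optimality condition.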

Algorithm MAX
Step 1. Choose a point x^0 ∈ S_i found as a local maximizer of problem (4)–(11) by the interior point method. Assume that f′(x^0) ≠ 0. Set k := 0.
Step 2. Solve the problem
max_{y ∈ E_{f(x^k)}(f)} Π(y).
Let y^k be a solution to this problem, i.e., Π(y^k) = max_{y ∈ E_{f(x^k)}(f)} Π(y). Set Θ(x^k) := Π(y^k), and let x^{k+1} ∈ S_i be a solution satisfying Π(y^k) = ⟨f′(y^k), x^{k+1} − y^k⟩.
Step 3. If Θ(x^k) = 0, then stop: x^k is a global solution. Otherwise, set k := k + 1 and return to Step 2.
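To make the scheme concrete, the following is a minimal numerical sketch of Algorithm MAX on a toy instance, not the authors' Matlab implementation: f(x) = ‖x‖² is maximized over a box in R², the level-set maximization of Step 2 is approximated by sampling angles on the circle E_{f(x^k)}(f), and Π is evaluated via a linear program. All names and the toy data are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def algorithm_max(A, b, x0, n_angles=360, max_iter=50, tol=1e-8):
    """Sketch of Algorithm MAX for max f(x) = ||x||^2 over S = {x : A x <= b}, x in R^2.
    Step 2's maximization of Pi over the level set E_{f(x_k)}(f), here a circle of
    radius ||x_k||, is approximated by sampling angles."""
    xk = np.asarray(x0, dtype=float)
    bounds = [(None, None)] * 2
    for _ in range(max_iter):
        r = np.linalg.norm(xk)
        best_val, best_x = -np.inf, None
        for t in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
            y = r * np.array([np.cos(t), np.sin(t)])   # y on the level set
            grad = 2.0 * y                             # f'(y) for f = ||.||^2
            res = linprog(-grad, A_ub=A, b_ub=b, bounds=bounds, method="highs")
            val = grad @ (res.x - y)                   # Pi(y) = <f'(y), x* - y>
            if val > best_val:
                best_val, best_x = val, res.x
        if best_val <= tol:        # Theta(x_k) = 0: stopping rule of Step 3
            return xk
        xk = best_x                # x_{k+1} attains Pi(y_k)
    return xk

# Box [-1, 1]^2: the convex objective ||x||^2 attains its maximum at a corner.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.ones(4)
x_opt = algorithm_max(A, b, x0=[0.9, 0.1])
print(x_opt)
```

With these choices the iterate jumps from the interior starting point to a vertex of the box, where the sampled Θ value falls below the tolerance and the algorithm stops.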
The convergence of Algorithm MAX is given by the following theorem.

Theorem 3.3. [3] The sequence {x^k, k = 1, 2, . . . } generated by Algorithm MAX is a maximizing sequence for problem (14), that is,
lim_{k→∞} f(x^k) = max_{x ∈ S_i} f(x).

4. Computational results. A tetrahedron with vertices A(1, 1, 0), B(4, 1, 0), C(3, 3, 0) and D(5, 4, 3) has been considered. As shown in Section 2, solving Malfatti's problem consists of three main cases. For Case 1, the problem is the following: For Case 2, we replace the 15th constraint in (16) with the following constraint: Also, for Case 3, instead of the 15th constraint in (16), we have the following constraint: In the computational experiment, from the viewpoint of geometric triviality, we did not consider cases where at least one ball is not tangent to the other two.
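The three cases of Section 2 differ only in how the pairwise distances between the centers relate to the sums of the radii; these tangency and non-intersection constraints are exactly what changes between (16) and its Case 2 and Case 3 variants. The following sketch classifies a given configuration of three balls; the function name and the convention that ball 0 is the one tangent to the other two are illustrative assumptions:

```python
import numpy as np

def classify(centers, radii, tol=1e-7):
    """Classify a 3-ball configuration into the three cases of Section 2,
    assuming ball 0 is the common tangent ball in Cases 2 and 3."""
    x, z = np.asarray(centers, float), np.asarray(radii, float)
    d = lambda i, j: np.linalg.norm(x[i] - x[j])
    tangent = lambda i, j: abs(d(i, j) - (z[i] + z[j])) <= tol
    if tangent(0, 1) and tangent(0, 2) and tangent(1, 2):
        return "Case 1"                            # mutually tangent
    if tangent(0, 1) and tangent(0, 2):
        if abs(d(1, 2) - (d(1, 0) + d(0, 2))) <= tol:
            return "Case 2"                        # centers collinear
        if d(1, 2) >= z[1] + z[2] - tol:
            return "Case 3"                        # balls 1 and 2 do not intersect
    return "other"

# Three unit circles, centers at the vertices of an equilateral triangle of side 2.
print(classify([[0.0, 0.0], [2.0, 0.0], [1.0, np.sqrt(3)]], [1.0, 1.0, 1.0]))  # Case 1
# Three unit circles with collinear centers, ball 0 in the middle.
print(classify([[0.0, 0.0], [-2.0, 0.0], [2.0, 0.0]], [1.0, 1.0, 1.0]))        # Case 2
```

In the optimization model these relations become equality constraints ‖x_i − x_j‖ = z_i + z_j for tangent pairs and an inequality ‖x_1 − x_2‖ ≥ z_1 + z_2 for the non-intersecting pair.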
The performance of the proposed algorithm was tested on the three cases of Malfatti's problem. The code for the algorithm was written in Matlab and run on a Pentium Core 2 computer. The results for each case are given in Table 1.

Malfatti's problem, generalized to three dimensions, has thus been reduced to a convex maximization problem with 12 variables. The global optimality conditions [11] as well as the global search algorithm of [3] have been applied to the problem. A computational experiment was carried out for a given tetrahedron. This approach can easily be extended to a number of balls n ≥ 4, but this will be discussed in a subsequent paper. Clearly, there exists a relationship between Malfatti's problem and Tverberg's theorem [12] in terms of partitions of a set of points, but it has not been considered in this paper.