AN EFFECTIVE HYBRID FIREFLY ALGORITHM WITH THE CUCKOO SEARCH FOR ENGINEERING OPTIMIZATION PROBLEMS

Firefly and cuckoo search algorithms are two of the most widely used nature-inspired algorithms due to their simplicity and inexpensive computational cost when they are applied to solve a wide range of problems. In this article, a new hybrid algorithm is suggested by combining the firefly algorithm and the cuckoo search algorithm to solve constrained optimization problems (COPs) and real-world engineering optimization problems. The proposed algorithm is called the Hybrid FireFly Algorithm and Cuckoo Search (HFFACS) algorithm. In the HFFACS algorithm, a balance between the exploration and the exploitation processes is maintained. The main drawback of the firefly algorithm is that it easily falls into stagnation when the new solution is not better than its previous best solution for several generations. In order to avoid this problem, the cuckoo search with Lévy flight is invoked to force the firefly algorithm to escape from stagnation and to avoid premature convergence. The proposed algorithm is applied to six benchmark constrained optimization problems and five engineering optimization problems and compared against four algorithms to investigate its performance. The numerical experimental results show that the proposed algorithm is promising and can obtain the optimal or a near-optimal solution within a reasonable time.


1. Introduction. Meta-heuristics are global search methods that cover all heuristic methods which give evidence of obtaining good solutions for the problem of interest within a reasonable time. Meta-heuristic structures are mainly based on simulating nature and artificial intelligence tools. The two main components of meta-heuristic methods are the exploration (diversification) process and the exploitation (intensification) process. In the exploration process, the search space is explored to avoid trapping in local minima, while in the exploitation process the best-obtained solution is improved. Meta-heuristics can be categorized into two classes: population-based methods and point-to-point based methods. The population-based methods include Ant Colony Optimization (ACO) [7], Genetic Algorithms (GAs) [15], Scatter Search (SS) [12], etc., while the point-to-point based methods include Tabu Search (TS) [11], Simulated Annealing (SA) [18], Variable Neighborhood Search (VNS) [26], [27], and Iterated Local Search (ILS) [35]. In the last two decades, new meta-heuristic methods have been inspired from nature. These methods are called nature-inspired methods or swarm intelligence algorithms, such as Artificial Bee Colony (ABC) [16], Particle Swarm Optimization (PSO) [17], bacterial foraging [31], Bat Algorithm (BA) [46], wolf search [36], cat swarm [6], Cuckoo Search [45], Firefly Algorithm (FA) [46], fish swarm/school [19], etc. These methods suffer from slow convergence, and they can fall into stagnation when no particle discovers a position that is better than its previous best position for several generations.
In order to circumvent the drawbacks of each metaheuristic algorithm, several researchers have tried to combine two metaheuristic algorithms (known as hybridization) or to integrate a metaheuristic algorithm with a local search algorithm (known as a memetic algorithm). Hybridization refers to the procedure of integrating the best aspects of two or more algorithms to construct an algorithm with good attributes. One of the most widely employed strategies for building hybrid metaheuristics is integrating metaheuristics with local search algorithms; this integration yields memetic algorithms. We refer interested readers to [29] for a good review of memetic algorithms. Another approach to framing hybrid metaheuristics is to integrate two or more metaheuristics [4]; we follow this approach in this work. Hybrid metaheuristics can outperform their ingredient algorithms due to this integration [30]. This feature is the main motivation for the study of hybrid metaheuristics. Many hybrid metaheuristics have been developed to enhance performance and to reach global optima [24], [28]. Although some of these enhancements and developments are important, new hybrid metaheuristic strategies need more investigation. Recent research in hybrid metaheuristics engages various directions. The first direction is the application of hybrid metaheuristics to various kinds of optimization problems, such as unconstrained optimization problems [1], constrained optimization [3], [43], integer programming, and minimax problems [2]. The second direction is the application of hybrid metaheuristics to real-life applications [20], [21]. The third direction is the integration of metaheuristics into a hybrid algorithm [22], [4]. The fourth direction is how to hybridize a given set of metaheuristics into a single algorithm [34], [10].
The objective of this paper is to propose a new hybrid algorithm by combining two of the most promising swarm intelligence algorithms, the firefly algorithm and the cuckoo search algorithm. The proposed algorithm is called the Hybrid FireFly Algorithm and Cuckoo Search (HFFACS) algorithm. In the HFFACS algorithm, the balance between the exploration and the exploitation processes is considered to overcome the slow convergence of the standard firefly algorithm. The standard firefly algorithm has a good ability to perform exploration and exploitation search; however, it easily falls into stagnation when the new solutions are not better than the current best solution. The cuckoo search algorithm and the Lévy flight, with their capability of performing a wide exploration of the search space, are invoked to help the firefly algorithm jump out of stagnation and to avoid premature convergence. In order to investigate the proposed algorithm, HFFACS is tested on six benchmark constrained optimization problems and five engineering optimization problems and compared against four algorithms. The computational results indicate that the proposed algorithm is promising and can obtain the optimal or a near-optimal solution within an acceptable time.
The rest of this paper is organized as follows. The definition of the constrained optimization problem is presented in Section 2. The main concepts of the firefly algorithm and the cuckoo search algorithm are discussed in Sections 3 and 4, respectively. The proposed algorithm is described in detail in Section 5. In Section 6, the experimental results and all the comparative experiments between the proposed algorithm and the other algorithms are reported. The description of the engineering optimization problems and the experimental results of the proposed algorithm on these problems are presented in Section 7. Finally, the conclusion and future work make up Section 8.

2. Constrained optimization problem. Constrained optimization problems and constraint handling are among the most challenging tasks in many applications. A general form of a constrained optimization problem is defined as follows:

    min f(x),  x = (x_1, ..., x_n),
    subject to  g_i(x) ≤ 0,  i = 1, ..., m,
                h_j(x) = 0,  j = 1, ..., l,
                x^l ≤ x ≤ x^u,   (1)

where f(x) is the objective function, x is the vector of n variables, g_i(x) ≤ 0 are inequality constraints, h_j(x) = 0 are equality constraints, and x^l, x^u are the variable bounds.
3. Overview of firefly algorithm. In the following subsections, the main concepts and characteristics of the firefly algorithm are highlighted.
3.1. Main concepts. The firefly algorithm (FA) is a nature-inspired population-based algorithm, proposed by Xin-She Yang in late 2007 and 2008 [44], [45]. According to many recent publications, FA is a promising algorithm that outperforms other meta-heuristic algorithms such as the algorithms in [23], [46]. FA is built on three flashing characteristics and idealized rules, which are inspired by real fireflies. These rules can be summarized as follows:
1. All fireflies are unisex, and they will move to other fireflies regardless of their sex.
2. The attractiveness of a firefly is proportional to its brightness, and it decreases as the distance from the other firefly increases. The less bright firefly will move towards the brighter one. A firefly will move randomly if no firefly is brighter than it.
3. The brightness of a firefly is determined by the value of the objective function.

3.2. Attractiveness and brightness. The attractiveness of a firefly is determined by its brightness, which is associated with the objective function. The less bright firefly is attracted to the brighter firefly. The brightness (light intensity) I of a firefly decreases with the distance from its source, and light is also absorbed by the environment. It is known that the light intensity I(r) varies following the inverse square law:

    I(r) = I_0 / r^2,   (2)

where I_0 is the light intensity at the source and r is the distance between any two fireflies. With a fixed light absorption coefficient γ, and to avoid the singularity at r = 0 in the expression in (2), the combined effect of both the inverse square law and absorption can be approximated by the following Gaussian form:

    I(r) = I_0 e^(−γr^2).   (3)

Since a firefly's attractiveness is proportional to the light intensity seen by adjacent fireflies, the attractiveness function of the firefly can be defined by

    β(r) = β_0 e^(−γr^2),   (4)

where β_0 is the attractiveness at r = 0 and γ is the media light absorption coefficient.
3.3. The distance between two fireflies. At positions x_i and x_j, the distance between any two fireflies i and j can be defined as the Euclidean (Cartesian) distance as follows [45], [44], [23]:

    r_ij = ||x_i − x_j|| = sqrt( Σ_{k=1}^{d} (x_{i,k} − x_{j,k})^2 ),   (5)

where x_{i,k} is the k-th component of the spatial coordinates x_i of the i-th firefly and d is the number of dimensions. For d = 2, Equation (5) reduces to

    r_ij = sqrt( (x_i − x_j)^2 + (y_i − y_j)^2 ).   (6)
3.4. Firefly movement. Firefly i is attracted to and moves towards firefly j if firefly j is brighter than firefly i. The movement of firefly i towards firefly j can be defined as follows:

    x_i = x_i + β_0 e^(−γ r_ij^2) (x_j − x_i) + α (rand − 1/2),   (7)

where the second term is due to the attraction, α is the randomization parameter, and rand is a vector of random numbers drawn uniformly from [0, 1].
3.5. Firefly algorithm. The main steps of the firefly algorithm are presented in Algorithm 1. The firefly algorithm starts by setting the initial values of its main parameters, such as the randomization parameter α, the firefly attractiveness β_0, the media light absorption coefficient γ, the population size P and the maximum generation number MGN. The initial population is generated randomly, and the objective function is calculated for each solution in the population. The following steps are repeated until the termination criterion is satisfied, which is reaching the desired number of iterations MGN. For each pair of solutions x_i and x_j, i = 1, . . . , P and j = 1, . . . , i, if the objective function of firefly j is better than the objective function of firefly i, firefly i moves towards firefly j as shown in Equation (7). The attractiveness varies with the distance r via exp(−γr^2) as shown in Equation (4). Each solution x_i in the population is evaluated, and the corresponding light intensity I_i of each solution is updated. The best-found solution is assigned to x_best and returned as the optimal solution.
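As an illustrative sketch (not the authors' MATLAB implementation), the steps of Algorithm 1 can be written for an unconstrained minimization problem as follows; the parameter defaults here are our own assumptions:

```python
import numpy as np

def firefly(f, bounds, P=25, MGN=100, alpha=0.25, beta0=1.0, gamma=1.0, seed=0):
    """Minimize f over the box `bounds = (lo, hi)` with the standard FA."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    d = len(lo)
    X = rng.uniform(lo, hi, size=(P, d))      # random initial population
    I = np.array([f(x) for x in X])           # light intensity = objective value
    for _ in range(MGN):
        for i in range(P):
            for j in range(P):
                if I[j] < I[i]:               # firefly j is brighter (minimization)
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)        # attractiveness, Eq. (4)
                    # movement towards j plus randomization, Eq. (7)
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(d) - 0.5)
                    X[i] = np.clip(X[i], lo, hi)
                    I[i] = f(X[i])            # update light intensity
    best = int(np.argmin(I))
    return X[best], I[best]
```

Note that the brightest firefly never finds a brighter partner, so it stays put and the best objective value never worsens across generations.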

4. Overview of cuckoo search algorithm. In the following subsections, the main concepts and characteristics of the cuckoo search algorithm are highlighted.

4.1. Main concepts. The cuckoo search (CS) algorithm is a nature-inspired population-based algorithm inspired by the reproduction strategy of cuckoo birds. Cuckoo birds lay their eggs in communal nests, and they may remove others' eggs to increase the probability of hatching their own eggs [32].
Some host birds can discover that the eggs are not their own; they then either throw these eggs away or abandon the nest and build a new nest in a new place. Some cuckoo species can mimic the color and the pattern of the eggs of a few host birds to reduce the probability of the intruding eggs being discovered. A cuckoo lays its eggs in a nest where the host bird has just laid its own eggs, since cuckoo eggs hatch earlier than the host bird's eggs. Once an egg hatches, the cuckoo chick starts to propel the host eggs out of the nest to increase its share of the food provided by its host bird.

4.2. Lévy flights. A Lévy flight [5] is a random walk in which the step lengths are distributed according to a heavy-tailed probability distribution. After a large number of steps, the distance from the origin of the random walk tends to a stable distribution.
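Although the paper does not specify how the Lévy steps are generated, a common choice in cuckoo search implementations is Mantegna's algorithm; the sketch below uses stability index beta = 1.5, the value typically used, as our assumption:

```python
import math
import numpy as np

def levy_steps(d, beta=1.5, rng=None):
    """Draw d heavy-tailed Lévy-flight step lengths via Mantegna's algorithm."""
    rng = rng if rng is not None else np.random.default_rng()
    # scale of the numerator Gaussian, Mantegna's formula
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size=d)
    v = rng.normal(0.0, 1.0, size=d)
    return u / np.abs(v) ** (1 / beta)   # heavy-tailed step in each dimension
```

Most draws are small, but occasional very long jumps appear, which is exactly the wide-exploration behavior the text attributes to Lévy flights.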

4.3. Cuckoo search characteristics. The cuckoo search algorithm is based on the following rules:
1. At a time, each cuckoo chooses a nest randomly to lay an egg in.
2. The best nest is the nest with high-quality eggs, and it will be carried over to the next generation.
3. The number of available host nests is fixed. The probability of the host bird discovering an intruding egg is P_a ∈ [0, 1].
4. If the host bird discovers the intruding egg, it either throws the egg out of the nest or abandons the nest and builds a new nest elsewhere.

4.4. Cuckoo search algorithm. The main steps of the standard cuckoo search algorithm are presented in Algorithm 2 and summarized as follows.
The standard cuckoo search algorithm starts by setting the initial values of its main parameters, such as the population size P, the probability P_a ∈ [0, 1] and the maximum number of iterations MGN, and the initial iteration counter t is set. The initial population of P solutions is generated randomly, and each solution x_i in the population is evaluated by calculating its fitness function f(x_i). The following steps are repeated until the termination criterion is satisfied. A new solution is generated randomly using a Lévy flight as follows:

    x_i^(t+1) = x_i^(t) + α ⊕ Lévy(λ),   (8)

where ⊕ denotes entry-wise multiplication, α is the step size, and Lévy(λ) is the Lévy distribution.
The new solution replaces a randomly selected solution if its objective function is better than the objective function of the selected solution. A fraction P_a of the worst solutions is abandoned and replaced by generating new solutions using local random walks as follows:

    x_i^(t+1) = x_i^(t) + γ (x_j^(t) − x_k^(t)),   (9)

where x_j^(t) and x_k^(t) are two different solutions selected randomly and γ is a random number.
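One generation of these two update rules can be sketched as follows; the function and parameter names are ours, and `levy` stands for any heavy-tailed step generator (e.g. Mantegna's algorithm):

```python
import numpy as np

def cs_generation(f, X, F, levy, alpha=0.01, Pa=0.25, rng=None):
    """One cuckoo-search generation: a Lévy-flight move (Equation 8) followed
    by local random walks rebuilding a fraction Pa of the worst nests (Equation 9)."""
    rng = rng if rng is not None else np.random.default_rng()
    P, d = X.shape
    # Lévy flight: x^(t+1) = x^(t) + alpha (+) Levy(lambda)
    i = rng.integers(P)
    cand = X[i] + alpha * levy(d)
    fc = f(cand)
    j = rng.integers(P)                      # compare with a randomly chosen nest
    if fc < F[j]:
        X[j], F[j] = cand, fc
    # Abandon a fraction Pa of the worst nests via local random walks
    n_worst = int(Pa * P)
    for w in np.argsort(F)[P - n_worst:]:
        a, b = rng.choice(P, size=2, replace=False)
        X[w] = X[w] + rng.random() * (X[a] - X[b])
        F[w] = f(X[w])
    return X, F
```

Since the Lévy move only ever replaces a nest with a better candidate and the random walks only rebuild the worst nests, the best nest in the population is never lost between generations.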
The solutions are ranked according to their objective values, the best solution is assigned, and the iteration counter is increased. These operations are repeated until the termination criterion is satisfied. Finally, the best-found solution is returned.

5. The proposed HFFACS algorithm. In this section, the proposed algorithm is presented as shown in Algorithm 3. The proposed algorithm is summarized as follows.
• Initialization. The proposed algorithm starts by setting all initial values of its parameters, such as the randomization parameter α, the firefly attractiveness β_0, the media light absorption coefficient γ, the population size P, the probability P_a ∈ [0, 1] and the maximum generation number MGN.
• Solution evaluation. The initial population of P host nests x_i^(t) is generated randomly, and each solution in the population is evaluated by calculating its objective function f(x_i).
• Firefly movement. For each pair of solutions x_i and x_j, if firefly j is brighter than firefly i, firefly i moves towards firefly j as shown in Equation (7).
• Cuckoo search operators. A nest x_j is chosen randomly among the P solutions; if the newly generated solution is better than x_j, it replaces x_j. A fraction P_a of the worst nests is abandoned, and new nests are built at new locations by generating new solutions using a Lévy flight as shown in Equation (8). The best solutions (nests with quality solutions) are kept, and the solutions are ranked to find the current best solution. The Lévy flight operator has a strong exploration ability and can help the proposed algorithm find the global minimum of the tested functions in a reasonable time.
• Termination criterion. The proposed algorithm is terminated when it reaches the maximum number of iterations MGN. Finally, the best solution is returned.
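Under our reading of Algorithm 3, the hybrid loop can be sketched as follows for unconstrained minimization; the coupling details (applying the cuckoo step every generation) and all parameter defaults are our assumptions, not the authors' exact implementation:

```python
import numpy as np

def hffacs(f, lo, hi, P=20, MGN=150, alpha=0.25, beta0=1.0, gamma=1.0,
           Pa=0.25, step=0.01, seed=0):
    """Schematic HFFACS sketch: firefly moves (Eq. 7) provide exploitation,
    and cuckoo-style Lévy flights rebuild a fraction Pa of the worst nests
    each generation to escape stagnation."""
    rng = np.random.default_rng(seed)
    d = len(lo)
    X = rng.uniform(lo, hi, (P, d))
    F = np.array([f(x) for x in X])
    for _ in range(MGN):
        # Firefly movement: each firefly moves towards every brighter one.
        for i in range(P):
            for j in range(P):
                if F[j] < F[i]:
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    X[i] += (beta0 * np.exp(-gamma * r2) * (X[j] - X[i])
                             + alpha * (rng.random(d) - 0.5))
                    X[i] = np.clip(X[i], lo, hi)
                    F[i] = f(X[i])
        # Cuckoo step: abandon the worst fraction Pa via crude Lévy(1.5) jumps.
        n = max(1, int(Pa * P))
        for w in np.argsort(F)[P - n:]:
            u, v = rng.normal(size=d), rng.normal(size=d)
            X[w] = np.clip(X[w] + step * u / np.abs(v) ** (2 / 3), lo, hi)
            F[w] = f(X[w])
    b = int(np.argmin(F))
    return X[b], F[b]
```

The firefly part never moves the current best solution and the cuckoo part only rebuilds the worst nests, so the best objective value is non-increasing across generations.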
5.1. Constraint handling. In this paper, a non-stationary penalty function is used, where the values of the penalty function are dynamically changed during the search process. A general form of the penalty function is defined as follows:

    F(x) = f(x) + h(k) H(x),   (10)

where f(x) is the objective function, h(k) is a non-stationary (dynamically modified) penalty value, k is the current iteration number, and H(x) is a penalty factor, which is calculated as follows:

    H(x) = Σ_{i=1}^{m} θ(q_i(x)) q_i(x)^{γ(q_i(x))},   (11)

where q_i(x) = max(0, g_i(x)), i = 1, . . . , m, the g_i are the constraints of the problem, q_i is a relative violation function of the constraints, θ(q_i(x)) is an assignment function, γ(q_i(x)) is the power of the penalty function, and the values of the functions h(·), θ(·) and γ(·) are problem dependent. In this paper, the penalty value is h(t) = t√t.

6. Numerical experiments. In order to investigate the efficiency of the proposed HFFACS algorithm, the general performance of HFFACS is examined on different benchmark functions, and its results are compared with those of variant particle swarm optimization algorithms. HFFACS was programmed in MATLAB; the results of the comparative algorithms are taken from their original papers. In the following subsections, the parameter settings of the proposed algorithm and the applied test functions are reported. The parameter settings of the HFFACS algorithm are summarized in Table 1; these values are based on common settings in the literature or determined through our preliminary numerical experiments.
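The non-stationary penalty scheme of Section 5.1 can be sketched as follows; the assignment function θ and power γ used here are illustrative placeholders, since the text notes they are problem dependent:

```python
import numpy as np

def penalized(f, constraints, x, k, theta=lambda q: 100.0, gamma=lambda q: 2.0):
    """Penalized objective F(x) = f(x) + h(k) * H(x) with h(k) = k * sqrt(k).
    `constraints` is a list of functions g_i with g_i(x) <= 0 when feasible.
    The theta/gamma defaults are placeholders, not the paper's settings."""
    q = np.array([max(0.0, g(x)) for g in constraints])   # violations q_i(x)
    H = sum(theta(qi) * qi ** gamma(qi) for qi in q)      # penalty factor H(x)
    return f(x) + k * np.sqrt(k) * H                      # h(k) = k * sqrt(k)
```

Because h(k) grows with the iteration counter, the same constraint violation is punished more heavily late in the run, pushing the population towards the feasible region as the search progresses.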

6.2. Test constrained optimization problems. The HFFACS algorithm is tested on six constrained optimization problems f_1 − f_6, which are reported in Table 2. The optimal values of these benchmark problems are reported in Table 3, and the functions with their definitions are reported as follows.
6.3. The general performance of the proposed HFFACS. In order to investigate the general performance of the proposed algorithm on the constrained optimization problems, the best, worst, mean, standard deviation (St.d) and success rate (Suc) of the function evaluations over 30 runs are reported in Table 4. A run is considered successful if the algorithm reaches the global minimum of a function within 100,000 function evaluations. The general performance on the functions f_1 − f_6 is shown in Figure 1 by plotting the number of iterations versus the function values after 100 iterations. Figure 1 shows that the function values decrease rapidly as the number of iterations increases. It can be concluded from Table 4 and Figure 1 that the proposed algorithm is promising and can obtain the optimal or a near-optimal solution within a few thousand function evaluations.

6.4. Comparison between the cuckoo search, firefly, and proposed HFFACS algorithms. In order to investigate the efficiency of the proposed algorithm, we compare it against the standard cuckoo search (CS) and firefly (FA) algorithms. The mean number of function evaluations of the three algorithms is reported in Table 5. The results in Table 5 show that the proposed HFFACS obtains the global minimum of all tested functions faster than the two standard algorithms. We can conclude from Table 5 that the combination of the standard cuckoo search and firefly algorithms in the proposed HFFACS accelerates its convergence.
6.5. HFFACS and other algorithms. HFFACS is compared with four benchmark algorithms (variants of particle swarm optimization) in order to verify its efficiency. Before discussing the comparison results of all algorithms, a brief description of the four comparative algorithms [33] is given as follows.
• RWMPSOg. RWMPSOg is the Random Walk Memetic Particle Swarm Optimization (global variant), which combines particle swarm optimization with a random walk with direction exploitation.
• RWMPSOl. RWMPSOl is the Random Walk Memetic Particle Swarm Optimization (local variant), which combines particle swarm optimization with a random walk with direction exploitation.
• PSOg. PSOg is the standard particle swarm optimization (global variant) without a local search method.
• PSOl. PSOl is the standard particle swarm optimization (local variant) without a local search method.
6.6. Comparison between RWMPSOg, RWMPSOl, PSOg, PSOl, and HFFACS. In this subsection, the comparison results between the proposed HFFACS algorithm and the other algorithms are presented in order to verify the efficiency of our proposed algorithm. The five comparative algorithms are tested on the six benchmark functions f_1 − f_6, which are reported in Table 2. The results of the comparative algorithms are taken from their original paper [33]. The average (Mean), standard deviation (St.d), and success rate (% Suc) of the function values over 30 runs are reported in Table 6. A run is considered successful if the algorithm reaches the global minimum within 100,000 function evaluations. The best results of the comparative algorithms are reported in boldface. The results in Table 6 show that the proposed HFFACS algorithm succeeds in all runs and obtains the desired objective value of each function faster than the other algorithms.

7. Engineering optimization problems. In order to further verify the efficiency of the proposed algorithm, we test it on five other real-world engineering optimization problems, which are presented in the following subsections.
7.1. Welded beam design problem. The first problem is the welded beam design problem, which aims to minimize the cost of a welded beam structure. The problem consists of a beam A and the weld required to hold it to member B, subject to constraints on the shear stress τ, the beam bending stress θ, the buckling load on the bar P_c, and the beam end deflection δ. The four design variables of the problem, h(x_1), l(x_2), t(x_3) and b(x_4), are shown in Figure 2, where the other parameters are defined as follows.

7.2. Pressure vessel design problem. The second problem is a cylindrical pressure vessel. The objective of this problem is to minimize the total cost of material, forming and welding. The problem has four design variables, which are shown in Figure 3: T_s (x_1, thickness of the shell), T_h (x_2, thickness of the head), R (x_3, inner radius) and L (x_4, length of the cylindrical section of the vessel without the head). T_s and T_h are integer multiples of 0.0625 in, while R and L are continuous variables.
7.3. Speed reducer design problem. The third problem is the speed reducer problem, which aims to minimize the gear box volume and weight subject to several constraints. The speed reducer is shown in Figure 4. The problem has seven design variables x_1 − x_7, where x_1 is the width of the gear face (cm), x_2 is the teeth module (cm), x_3 is the number of pinion teeth, and x_4 is the length of shaft 1 between the bearings (cm).

7.4. Three-bar truss design problem. The fourth problem is the three-bar truss design problem. The objective of this problem is to minimize the weight f(x) of a three-bar truss subject to some constraints, as follows.
7.5. Spring design problem. The last problem is the spring design problem. The objective of the problem is to minimize the weight f(x) of the spring subject to constraints on the minimum deflection, shear stress, surge frequency, limits on the outside diameter, and the design variables. The mean coil diameter D(x_2), the wire diameter d(x_1) and the number of active coils P(x_3) are the design variables, as shown in Figure 5. The objective is

    f(x) = (x_3 + 2) x_2 x_1^2,

subject to the constraints, where 0.05 ≤ x_1 ≤ 2, 0.25 ≤ x_2 ≤ 1.3 and 2 ≤ x_3 ≤ 15.
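As a quick numerical check of the spring-weight objective under the stated bounds (the deflection, stress and frequency constraint functions are omitted here):

```python
def spring_weight(x1, x2, x3):
    """Spring weight f(x) = (x3 + 2) * x2 * x1^2; the design vector is
    (x1, x2, x3) = (wire diameter d, mean coil diameter D, active coils)."""
    assert 0.05 <= x1 <= 2.0 and 0.25 <= x2 <= 1.3 and 2.0 <= x3 <= 15.0
    return (x3 + 2) * x2 * x1 ** 2
```

For example, at the lower corner of the variable box, spring_weight(0.05, 0.25, 2) = (2 + 2) · 0.25 · 0.05^2 = 0.0025.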
7.6. The general performance of the proposed HFFACS algorithm on engineering optimization problems. The convergence curves of the proposed algorithm on the five engineering optimization problems are shown in Figure 6, where we plot the number of iterations versus the function values. The best, mean, worst and standard deviation of the function evaluations and function values over 30 runs are reported in Tables 7 and 8, respectively. We can conclude from Figure 6 and Tables 7 and 8 that the proposed algorithm can reach the global minimum of the tested functions within a small number of iterations.
8. Conclusion and future work. A new hybrid nature-inspired algorithm is presented in this paper to solve constrained and real-world engineering optimization problems. The proposed algorithm is called the Hybrid Firefly Algorithm and Cuckoo Search (HFFACS) algorithm. Invoking the cuckoo search algorithm with a Lévy flight improves the performance of the firefly algorithm and helps it to avoid premature convergence and trapping in local minima. In order to examine the general performance of the HFFACS algorithm, it was tested on six benchmark constrained problems and five real-world engineering problems and compared against four algorithms. The experimental results show that HFFACS is faster than the other four algorithms and can obtain the optimal or a near-optimal solution within a reasonable time. In future work, we would like to apply our proposed algorithm to unconstrained optimization problems [1], large-scale problems, the molecular potential energy function [42], [41], the team formation problem [8], and minimax and integer programming problems [2], [39], [40]. Furthermore, we would like to use a binary version to solve feature selection problems [37], [38].