# American Institute of Mathematical Sciences

October  2017, 13(4): 1841-1857. doi: 10.3934/jimo.2017021

## Incremental gradient-free method for nonsmooth distributed optimization

1. School of Mathematical Sciences, Chongqing Normal University, Chongqing 400047, China
2. Australasian Joint Research Center for Building Information Modelling, School of Built Environment, Curtin University, Bentley, WA 6102, Australia
3. Department of Naval Architecture and Ocean Engineering, Pusan National University, Busan, Korea

Received: March 2015. Revised: August 2016. Published: December 2016.

Fund Project: This research was partially supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) through GCRC-SOP (No. 2011-0030013); by the Natural Science Foundation of China (11501070, 11401064, 11471062, and 61473326); by the Natural Science Foundation Project of Chongqing (cstc2015jcyjA00011, cstc2013jjB00001, and cstc2013jcyjA00029); by the Chongqing Municipal Education Commission under Grants KJ1500301 and KJ1500302; and by the Chongqing Normal University Research Foundation (15XLB005).

In this paper we consider minimizing the sum of local convex component functions distributed over a multi-agent network. We first extend Nesterov's random gradient-free method to the incremental setting, and then propose incremental gradient-free methods that select the component function in either cyclic or randomized order. We establish convergence and iteration-complexity results for the proposed methods under suitable stepsize rules. Extensive numerical results on a distributed $l_1$-regression problem illustrate the proposed methods. Compared with existing incremental subgradient-based methods, our methods require only evaluations of function values rather than subgradients, which may be preferable in practical engineering applications.
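The randomized incremental scheme described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact RIGF algorithm: it pairs a Nesterov-style two-point Gaussian-smoothing gradient estimate with a randomized incremental pass over the components of a toy $l_1$-regression problem. The names `rgf_grad` and `rigf`, and all parameter values, are hypothetical choices for this sketch.

```python
import numpy as np

def rgf_grad(f, x, mu=1e-4, rng=None):
    # Nesterov-style two-point random gradient-free estimate:
    # sample a Gaussian direction u and finite-difference f along it.
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)
    return ((f(x + mu * u) - f(x)) / mu) * u

def rigf(component_fns, x0, alpha=0.001, iters=5000, seed=0):
    # Randomized incremental gradient-free method: each iteration picks one
    # component function uniformly at random and takes a gradient-free step.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    m = len(component_fns)
    for _ in range(iters):
        i = rng.integers(m)
        x = x - alpha * rgf_grad(component_fns[i], x, rng=rng)
    return x

# Toy distributed l1-regression: minimize F(x) = sum_i |a_i^T x - b_i|,
# where agent i holds only its local component f_i(x) = |a_i^T x - b_i|.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 2))
x_true = np.array([1.0, -2.0])
b = A @ x_true
fns = [lambda x, a=a, bi=bi: abs(a @ x - bi) for a, bi in zip(A, b)]
x_hat = rigf(fns, np.zeros(2), alpha=0.01, iters=5000)
```

Note that each step uses only two function evaluations of a single component, never a subgradient. With a constant stepsize the iterates converge only to a neighborhood of the minimizer whose size scales with the stepsize, which is why diminishing stepsize rules are also of interest.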

Citation: Jueyou Li, Guoquan Li, Zhiyou Wu, Changzhi Wu, Xiangyu Wang, Jae-Myung Lee, Kwang-Hyo Jung. Incremental gradient-free method for nonsmooth distributed optimization. Journal of Industrial & Management Optimization, 2017, 13 (4) : 1841-1857. doi: 10.3934/jimo.2017021
Figure 1. Function value error versus number of cycles $K$ with a constant stepsize $\alpha=0.001$ for algorithms CIGF and CISG.

Figure 2. Function value error versus number of iterations $N$ with a constant stepsize $\alpha=0.001$ for algorithms RIGF and RISG.

Figure 3. (a) Function value error versus number of cycles $K$ with diminishing stepsize choices $\alpha_{1}(k)={1}/{(m(k-1)+i)}$ and $\alpha_{2}(k)={1}/{(m(k-1)+i)^{2/3}}$, $k=0, 1, \ldots$, $i=1, \ldots, m$; (b) function value error versus number of iterations $N$ with diminishing stepsize choices $\theta_{1}(k)={1}/{k}$ and $\theta_{2}(k)={0.1}/{k^{2/3}}$, $k=1, 2, \ldots$

Figure 4. For a fixed target accuracy $\epsilon=0.01$ and a constant stepsize $\alpha=0.001$, comparisons between algorithms CIGF and CISG: (a) number of iterations $N$ versus agent dimension $d$ for fixed $m=100$; (b) number of iterations $N$ versus number of agents $m$ for fixed $d=2$.
