
# Some worst-case datasets of deterministic first-order methods for solving binary logistic regression

* Corresponding author: Yuyuan Ouyang

The authors are supported by National Science Foundation grant DMS-1913006 and Office of Naval Research grant N00014-19-1-2295.

We present in this paper some worst-case datasets for deterministic first-order methods applied to large-scale binary logistic regression problems. Under the assumption that the number of algorithm iterations is much smaller than the problem dimension, any deterministic first-order method requires at least ${\mathcal O}(1/\sqrt{\varepsilon})$ first-order oracle inquiries on our worst-case datasets to compute an $\varepsilon$-approximate solution. From the traditional iteration complexity analysis point of view, the binary logistic regression loss functions built from our worst-case datasets are new worst-case function instances within the class of smooth convex optimization problems.

Mathematics Subject Classification: Primary: 90C06, 90C25, 90C30.
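To make the problem setting concrete, the following is a minimal sketch (not the paper's worst-case construction) of the binary logistic regression loss and a plain gradient descent loop, which is one example of a deterministic first-order method that issues exactly one first-order oracle inquiry per iteration. The data matrix `A`, label vector `b`, step size, and iteration count below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def logistic_loss(x, A, b):
    # Binary logistic regression loss:
    #   f(x) = (1/m) * sum_i log(1 + exp(-b_i * <a_i, x>)),  b_i in {-1, +1}
    z = -b * (A @ x)
    # log(1 + exp(z)) computed stably as logaddexp(0, z)
    return np.mean(np.logaddexp(0.0, z))

def logistic_grad(x, A, b):
    # Gradient of f: (1/m) * A^T (-b * sigmoid(-b * A x))
    m = A.shape[0]
    z = -b * (A @ x)
    sigma = 1.0 / (1.0 + np.exp(-z))  # sigmoid(z)
    return (A.T @ (-b * sigma)) / m

def gradient_descent(A, b, steps=100, lr=1.0):
    # A deterministic first-order method: one oracle call (gradient) per iteration.
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x -= lr * logistic_grad(x, A, b)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))          # illustrative random dataset
b = np.sign(rng.standard_normal(50))       # labels in {-1, +1}
x_hat = gradient_descent(A, b)
```

The lower-bound result concerns datasets `(A, b)` chosen adversarially so that, when the iteration count is much smaller than the dimension, no deterministic first-order method of this type can beat the ${\mathcal O}(1/\sqrt{\varepsilon})$ oracle complexity.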

