# American Institute of Mathematical Sciences

August  2020, 19(8): 4069-4083. doi: 10.3934/cpaa.2020180

## Quantitative convergence analysis of kernel based large-margin unified machines

1. Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong, China
2. Department of Mathematics, Zhejiang Normal University, Jinhua, Zhejiang 321004, China

* Corresponding author

Received: August 2019. Revised: September 2019. Published: May 2020.

Fund Project: The work by J. Fan is partially supported by the Hong Kong RGC ECS grant 22303518, the HKBU FRG grant FRG2/17-18/091, and the NSF grant of China (No. 11801478). The work by D.-H. Xiang is supported by the National Natural Science Foundation of China under Grants 11871438 and 11771120.

High-dimensional binary classification has been intensively studied in the machine learning community over the last few decades. The support vector machine (SVM), one of the most popular classifiers, depends on only a portion of the training samples, called support vectors, which leads to suboptimal performance in the high-dimension, low-sample-size (HDLSS) setting. Large-margin unified machines (LUMs) are a family of margin-based classifiers proposed to solve the so-called "data piling" problem inherent in SVM under HDLSS settings. In this paper we study the binary classification algorithms associated with LUM loss functions in the framework of reproducing kernel Hilbert spaces. A quantitative convergence analysis is carried out for these algorithms by means of a novel application of projection operators, which overcomes the main technical difficulty. The rates are derived explicitly under a priori conditions on the approximation ability and capacity of the reproducing kernel Hilbert space.
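The LUM loss family is not spelled out in this abstract; as a rough illustration, the sketch below uses the parameterization commonly attributed to Liu, Zhang and Wu (2011), who introduced LUMs. The function names and parameter defaults here are illustrative assumptions, not taken from the paper. The loss is evaluated at the functional margin $u = yf(x)$: it is SVM-like (linear) below a threshold and has a smooth, strictly positive tail above it, so every sample contributes to the fit, which is what mitigates data piling under HDLSS.

```python
import numpy as np

def lum_loss(u, a=1.0, c=1.0):
    """Large-margin unified machine (LUM) loss at the functional margin
    u = y * f(x), in the parameterization of Liu, Zhang and Wu (2011):

        V(u) = 1 - u                                   if u <  c / (1 + c)
        V(u) = (1/(1+c)) * (a / ((1+c)u - c + a))**a   otherwise

    with a > 0 and c >= 0.  Large c recovers a hinge-like (SVM) loss;
    c = 0 gives a DWD-type loss whose tail never vanishes.  The two
    branches meet continuously at u = c/(1+c), where both equal 1/(1+c).
    """
    u = np.asarray(u, dtype=float)
    thresh = c / (1.0 + c)
    linear = 1.0 - u  # SVM-like part, active for small margins
    tail = (1.0 / (1.0 + c)) * (a / ((1.0 + c) * u - c + a)) ** a
    return np.where(u < thresh, linear, tail)
```

For example, with `a=1.0, c=0.0` the loss is `1 - u` for negative margins and `1 / (1 + u)` for nonnegative ones, so it decays smoothly instead of hitting zero the way the hinge loss does. (Note that `np.where` evaluates both branches, so non-integer `a` at very negative margins can emit a harmless invalid-power warning from the unused tail branch.)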

Citation: Jun Fan, Dao-Hong Xiang. Quantitative convergence analysis of kernel based large-margin unified machines. Communications on Pure & Applied Analysis, 2020, 19 (8) : 4069-4083. doi: 10.3934/cpaa.2020180
