August  2020, 19(8): 4159-4177. doi: 10.3934/cpaa.2020186

## Kernel-based maximum correntropy criterion with gradient descent method

Ting Hu, School of Mathematics and Statistics, Wuhan University, Wuhan, China

Received: September 2019. Revised: December 2019. Published: May 2020.

Fund Project: The author is supported by NSFC grants 11671307 and 11571078.

In this paper, we study the convergence of the gradient descent method for the maximum correntropy criterion (MCC) associated with reproducing kernel Hilbert spaces (RKHSs). MCC is widely used in real-world applications because of its robustness and its ability to handle non-Gaussian impulsive noise. In the regression setting, we show that the gradient descent iterates of MCC can approximate the target function, and we derive a capacity-dependent convergence rate by choosing a suitable number of iterations. Our rate nearly matches the optimal rate established in previous work, and it makes clear that the scaling parameter is crucial to MCC's approximation ability and robustness. The novelty of our work lies in a sharp estimate for the norms of the gradient descent iterates and in a projection operation applied to the last iterate.
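To make the setting concrete, below is a minimal Python sketch of kernel gradient descent under the empirical correntropy objective. It is an illustration only, not the algorithm analyzed in the paper: the Gaussian kernel, the kernel width, the step size, the iteration count, and the `mcc_gradient_descent` helper are all hypothetical choices, and the iterate is represented as a coefficient vector over the training kernel sections.

```python
import numpy as np

def gaussian_kernel(X, Z, width=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def mcc_gradient_descent(X, y, sigma=1.0, step=0.5, iters=200, width=1.0):
    """Illustrative kernel gradient ascent on the empirical correntropy.

    The iterate f_t is stored as coefficients alpha over the kernel sections
    K(x_i, .), i.e. f_t(x) = sum_i alpha_i K(x_i, x).  Each step ascends
        (1/n) * sum_i exp(-(y_i - f_t(x_i))^2 / sigma^2),
    whose gradient down-weights samples with large residuals; this weighting
    is the usual source of MCC's robustness to impulsive noise.
    """
    n = len(y)
    K = gaussian_kernel(X, X, width)      # Gram matrix on the sample
    alpha = np.zeros(n)                   # start from f_0 = 0
    for _ in range(iters):
        residual = y - K @ alpha          # y_i - f_t(x_i)
        weights = np.exp(-residual ** 2 / sigma ** 2)
        # ascent direction in coefficient space (constants absorbed in step)
        alpha += step * (weights * residual) / (n * sigma ** 2)
    return alpha, K

# Toy usage: regression with a few impulsive outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
y[:3] += 5.0                              # inject impulsive noise
alpha, K = mcc_gradient_descent(X, y, sigma=0.5)
print("training RMSE:", np.sqrt(np.mean((y - K @ alpha) ** 2)))
```

In this sketch the weight exp(-r_i^2/sigma^2) shrinks toward zero for large residuals, so outliers barely move the iterate; the scaling parameter sigma controls how aggressively this happens and thus trades robustness against approximation ability, in line with the role of the scaling parameter described in the abstract.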

Citation: Ting Hu. Kernel-based maximum correntropy criterion with gradient descent method. Communications on Pure & Applied Analysis, 2020, 19 (8) : 4159-4177. doi: 10.3934/cpaa.2020186
