August  2020, 19(8): 3917-3932. doi: 10.3934/cpaa.2020172

Learning rates for partially linear functional models with high dimensional scalar covariates

1. School of Statistics, Southwestern University of Finance and Economics, Chengdu, 611130, China
2. School of Statistics and Management, Shanghai University of Finance and Economics, Shanghai, China
3. College of Statistics and Mathematics, Nanjing Audit University, Nanjing, China

* Corresponding author

Received: January 2019. Revised: April 2019. Published: May 2020.

Fund Project: This work is partly supported by the National Social Science Fund of China (NSSFC-16BTJ013, NSSFC-16ZDA010), the Sichuan Social Science Fund (SC14B091) and the Sichuan Project of Science and Technology (2017JY0273). Shaogao Lv is the corresponding author; his research is partially supported by NSFC-11871277.

This paper is concerned with learning rates for partially linear functional models (PLFM) within reproducing kernel Hilbert spaces (RKHS), where the covariates consist of two parts: functional covariates and scalar ones. In contrast to the functional principal component analysis (FPCA) frequently used for functional models, the finite set of basis functions in the proposed approach is generated automatically by exploiting the reproducing property of the RKHS. This avoids the additional computational cost of a PCA decomposition and of choosing the number of principal components. Moreover, the coefficient estimators with bounded covariates converge to the true coefficients at linear rates, as if the functional term in the PLFM had no effect on the linear part. In contrast, the prediction error for the functional estimator is significantly affected by the ambient dimension of the scalar covariates. Finally, we develop a numerical algorithm for the proposed penalized approach, and simulation experiments are presented to support our theoretical results.
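To make the setting concrete: a PLFM posits $y_i = \mathbf{z}_i^\top \boldsymbol{\gamma}^0 + \int_0^1 x_i(t) f^0(t)\,dt + \epsilon_i$, and the reproducing property lets $f$ be expanded over kernel sections at the observation grid, so no FPCA step is needed. The sketch below is only an illustration of this idea under assumed choices (Gaussian kernel, bandwidth, penalty levels, and alternating kernel-ridge/lasso updates), not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, T = 100, 10, 50          # samples, scalar covariates, grid size
t = np.linspace(0, 1, T)

# Gaussian reproducing kernel evaluated on the grid (bandwidth is an assumption)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.1)

# simulate a PLFM: y = Z @ gamma0 + <X, f0> + noise, with sparse gamma0
Z = rng.standard_normal((n, p))
gamma0 = np.zeros(p)
gamma0[:3] = [1.5, -1.0, 0.5]
L = np.linalg.cholesky(K + 1e-8 * np.eye(T))
X = rng.standard_normal((n, T)) @ L.T          # smooth functional covariates
f0 = np.sin(2 * np.pi * t)
y = Z @ gamma0 + (X @ f0) / T + 0.1 * rng.standard_normal(n)

# representer expansion f(.) = sum_j c_j K(t_j, .);  <x_i, f> ~ (X K c)_i / T
A = X @ K / T

# alternating minimization: kernel ridge in c, coordinate-descent lasso in gamma
lam1, lam2 = 0.05, 1e-3        # penalty levels chosen by hand for this sketch
gamma, c = np.zeros(p), np.zeros(T)
for _ in range(200):
    # c-step: penalized least squares on the partial residual y - Z gamma
    r = y - Z @ gamma
    c = np.linalg.solve(A.T @ A + n * lam2 * K + 1e-8 * np.eye(T), A.T @ r)
    # gamma-step: one coordinate-descent pass with soft thresholding
    r = y - A @ c
    for j in range(p):
        partial = r - Z @ gamma + Z[:, j] * gamma[j]
        rho = Z[:, j] @ partial / n
        zj = Z[:, j] @ Z[:, j] / n
        gamma[j] = np.sign(rho) * max(abs(rho) - lam1, 0.0) / zj

print(np.round(gamma, 2))      # sparse estimate concentrated on the first 3 entries
```

The lasso step reflects the paper's high-dimensional scalar part, and the kernel-ridge step the RKHS functional part; in this toy run the nonzero coordinates of $\boldsymbol{\gamma}^0$ are recovered while the remaining ones are shrunk to (near) zero.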

Citation: Yifan Xia, Yongchao Hou, Xin He, Shaogao Lv. Learning rates for partially linear functional models with high dimensional scalar covariates. Communications on Pure and Applied Analysis, 2020, 19 (8) : 3917-3932. doi: 10.3934/cpaa.2020172
References:
[1]

N. Aronszajn, Theory of reproducing kernels, Trans. Amer. Math. Soc., 68 (1950), 337-404. doi: 10.2307/1990404.

[2]

P. J. Bickel, Y. Ritov and A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, Ann. Statist., 37 (2009), 1705-1732. doi: 10.1214/08-AOS620.

[3]

O. Bousquet, A Bennett concentration inequality and its application to suprema of empirical processes, C. R. Math. Acad. Sci. Paris, 334 (2002), 495-500. doi: 10.1016/S1631-073X(02)02292-6.

[4] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, Cambridge, 2004. doi: 10.1017/CBO9780511804441.
[5]

P. Bühlmann and S. van de Geer, Statistics for High-Dimensional Data: Methods, Theory and Applications, Springer-Verlag, Berlin, Heidelberg, 2011. doi: 10.1007/978-3-642-20192-9.

[6]

T. T. Cai and P. Hall, Prediction in functional linear regression, Ann. Statist., 34 (2006), 2159-2179.  doi: 10.1214/009053606000000830.

[7]

T. T. Cai and M. Yuan, Minimax and adaptive prediction for functional linear regression, J. Amer. Statist. Assoc., 107 (2012), 1201-1216.  doi: 10.1080/01621459.2012.716337.

[8]

A. Cuevas, M. Febrero and R. Fraiman, Linear functional regression: The case of fixed design and functional response, Canadian J. Statist., 30 (2002), 285-300. doi: 10.2307/3315952.

[9]

D. Donoho, Compressed sensing, IEEE. Trans. Inform. Theory, 52 (2006), 1289-1306.  doi: 10.1109/TIT.2006.871582.

[10]

F. Ferraty and P. Vieu, Nonparametric Functional Data Analysis: Theory and Practice, Springer, New York, 2006.

[11]

X. Ji, J. W. Han, X. Jiang, X. T. Hu, L. Guo, J. G. Han, L. Shao and T. M. Liu, Analysis of music/speech via integration of audio content and functional brain response, Inform. Sci., 297 (2015), 271-282. doi: 10.1016/j.ins.2014.11.020.

[12]

D. H. Kong, J. G. Ibrahim, E. Lee and H. Zhu, FLCRM: Functional linear Cox regression model, Biometrics, 74 (2018), 109-117. doi: 10.1111/biom.12748.

[13]

D. H. Kong, K. J. Xue, F. Yao and H. H. Zhang, Partially functional linear regression in high dimensions, Biometrika, 103 (2016), 147-159. doi: 10.1093/biomet/asv062.

[14]

M. Ledoux, The Concentration of Measure Phenomenon, Mathematical Surveys and Monographs, American Mathematical Society, Providence, RI, 2001.

[15]

P. Müller and S. van de Geer, The partial linear model in high dimensions, Scand. J. Statist., 42 (2015), 580-608. doi: 10.1111/sjos.12124.

[16]

C. Preda and G. Saporta, PLS regression on a stochastic process, Comput. Statist. Data Anal., 48 (2005), 149-158.  doi: 10.1016/j.csda.2003.10.003.

[17]

J. O. Ramsay and B. W. Silverman, Functional Data Analysis, 2nd edition, Springer, New York, 2005.

[18]

E. Sanchez-Lozano, G. Tzimiropoulos, B. Martinez, F. De la Torre and M. Valstar, A functional regression approach to facial landmark tracking, IEEE Trans. Pattern Anal. Mach. Intell., 40 (2018), 2037-2050. doi: 10.1109/TPAMI.2017.2745568.

[19]

H. Shin and M. H. Lee, On prediction rate in partial functional linear regression, J. Multivariate Anal., 103 (2012), 93-106. doi: 10.1016/j.jmva.2011.06.011.

[20]

M. Talagrand, New concentration inequalities in product spaces, Invent. Math., 126 (1996), 505-563.  doi: 10.1007/s002220050108.

[21]

Q. G. Tang and L. S. Cheng, Partial functional linear quantile regression, Sci. China Math., 57 (2014), 2589-2608.  doi: 10.1007/s11425-014-4819-x.

[22]

J. P. Thivierge, Functional data analysis of cognitive events in EEG, in IEEE International Conference on Systems, Man and Cybernetics, (2007), 2473-2478. doi: 10.1109/ICSMC.2007.4413811.

[23]

R. J. Tibshirani, Regression shrinkage and selection via the lasso, J. R. Statist. Soc. B, 58 (1996), 267-288. 

[24] S. van de Geer, Empirical Processes in M-Estimation, Cambridge University Press, New York, 2000.
[25]

M. J. Wainwright, Sharp thresholds for high-dimensional and noisy sparsity recovery using $\ell_1$-constrained quadratic programming, IEEE Trans. Inf. Theory, 55 (2009), 2183-2202. doi: 10.1109/TIT.2009.2016018.

[26]

F. Yao, H. G. Müller and J. L. Wang, Functional data analysis for sparse longitudinal data, J. Amer. Statist. Assoc., 100 (2005), 577-590. doi: 10.1198/016214504000001745.

[27]

M. Yuan and T. T. Cai, A reproducing kernel Hilbert space approach to functional linear regression, Ann. Statist., 38 (2010), 3412-3444. doi: 10.1214/09-AOS772.

[28]

D. D. Yu, L. L. Kong and I. Mizera, Partial functional linear quantile regression for neuroimaging data analysis, Neurocomputing, 195 (2016), 74-87. doi: 10.1016/j.neucom.2015.08.116.

Table 1.  The averaged performance measures of the proposed method in the simulated example (standard errors in parentheses)
(n, v)       $\|\widehat{\gamma}-\gamma^0\|_2$   $E\|\widehat{f}-f^0\|$   $\|\widehat{\mathbf{y}}-\mathbf{y}\|_2$
(100, 1.1)   0.3401 (0.0166)                     4.0851 (0.0085)          4.3526 (0.0151)
(100, 2)     0.3338 (0.0166)                     4.0578 (0.0085)          4.3226 (0.0151)
(100, 4)     0.3232 (0.0166)                     4.0329 (0.0085)          4.2887 (0.0151)
(200, 1.1)   0.2235 (0.0136)                     4.0797 (0.0088)          4.2708 (0.0137)
(200, 2)     0.2230 (0.0134)                     4.0540 (0.0089)          4.2460 (0.0134)
(200, 4)     0.2166 (0.0119)                     4.0313 (0.0087)          4.2185 (0.0126)