# American Institute of Mathematical Sciences

May 2019, 2(2): 169-181. doi: 10.3934/mfc.2019012

## An RKHS approach to estimate individualized treatment rules based on functional predictors

1. Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong, China
2. School of Mathematical Sciences, Shanghai Key Laboratory for Contemporary Applied Mathematics, Fudan University, Shanghai 200433, China

* Corresponding author: Lei Shi

Published July 2019

In recent years there has been massive interest in precision medicine, which aims to tailor treatment plans to the individual characteristics of each patient. This paper studies the estimation of individualized treatment rules (ITRs) based on functional predictors such as images or spectra. We consider a reproducing kernel Hilbert space (RKHS) approach to learn the optimal ITR, namely the rule that maximizes the expected clinical outcome. Although the method involves infinite-dimensional functional data, the algorithm can be implemented conveniently. We establish a convergence rate for prediction under mild conditions; the rate is jointly determined by the covariance kernel of the functional predictor and the reproducing kernel.
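The RKHS approach summarized above can be illustrated with a minimal regression-based sketch: fit one kernel ridge (RKHS) outcome model per treatment arm on discretized functional predictors, then assign each subject the arm with the larger predicted outcome. The simulation setup, the Gaussian kernel, and all tuning parameters below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def gaussian_gram(X1, X2, sigma=1.0):
    """Gram matrix of a Gaussian kernel on curves sampled at m grid points."""
    # Averaging over the grid approximates the squared L2 distance of the curves.
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).mean(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def kernel_ridge(X, y, lam=1e-2, sigma=1.0):
    """Solve (K + n*lam*I) alpha = y; return a predictor for new curves."""
    n = len(y)
    K = gaussian_gram(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xnew: gaussian_gram(Xnew, X, sigma) @ alpha

rng = np.random.default_rng(0)
n, m = 200, 50                                     # subjects, grid points per curve
t = np.linspace(0.0, 1.0, m)
c = rng.normal(size=(n, 1))
X = c * np.sin(2 * np.pi * t)                      # functional predictor X_i(t)
A = rng.choice([-1, 1], size=n)                    # randomized treatment assignment
benefit = c.ravel()                                # true treatment benefit per subject
Y = 1.0 + A * benefit + 0.1 * rng.normal(size=n)   # clinical outcome (larger is better)

# Fit one outcome model per arm; the estimated ITR assigns each subject the
# arm with the larger predicted outcome, approximating the maximizer of E[Y].
f_pos = kernel_ridge(X[A == 1], Y[A == 1])
f_neg = kernel_ridge(X[A == -1], Y[A == -1])
itr = lambda Xnew: np.where(f_pos(Xnew) >= f_neg(Xnew), 1, -1)

# Fraction of subjects for whom the estimated rule matches the true benefit sign.
agreement = float((itr(X) == np.sign(benefit)).mean())
```

In this toy setting each curve is a scalar multiple of a fixed shape, so the kernel ridge fits only need to recover a smooth function of that scalar; with real functional data the choice of kernel and regularization parameter would drive the convergence rate, as the abstract indicates.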

Citation: Jun Fan, Fusheng Lv, Lei Shi. An RKHS approach to estimate individualized treatment rules based on functional predictors. Mathematical Foundations of Computing, 2019, 2(2): 169-181. doi: 10.3934/mfc.2019012
