References:
[1] J. O. Berger, "Statistical Decision Theory and Bayesian Analysis," Second edition, Springer Series in Statistics, Springer-Verlag, New York, 1985.
[2] P. J. Bickel and K. A. Doksum, "Mathematical Statistics: Basic Ideas and Selected Topics," Second edition, Prentice-Hall, Inc., 2001.
[3] C. J. C. Burges, A tutorial on support vector machines for pattern recognition, Data Mining and Knowledge Discovery, 2 (1998), 121-167.
[4] O. Chapelle, V. Vapnik, O. Bousquet and S. Mukherjee, Choosing multiple parameters for support vector machines, Machine Learning, 46 (2002), 131-159.
[5] B. Chen and P. T. Harker, Smooth approximations to nonlinear complementarity problems, SIAM J. Optimization, 7 (1997), 403-420.
[6] C. Chen and O. L. Mangasarian, A class of smoothing functions for nonlinear and mixed complementarity problems, Computational Optimization and Applications, 5 (1996), 97-138.
[7] C. Chen and O. L. Mangasarian, Smoothing methods for convex inequalities and linear complementarity problems, Math. Programming, 71 (1995), 51-69. doi: 10.1016/0025-5610(95)00005-4.
[8] X. Chen, L. Qi and D. Sun, Global and superlinear convergence of the smoothing Newton method and its application to general box constrained variational inequalities, Math. of Computation, 67 (1998), 519-540.
[9] X. Chen and Y. Ye, On homotopy-smoothing methods for variational inequalities, SIAM J. Control and Optimization, 37 (1999), 589-616. doi: 10.1137/S0363012997315907.
[10] G. P. Crespi, I. Ginchev and M. Rocca, Two approaches toward constrained vector optimization and identity of the solutions, Journal of Industrial and Management Optimization, 1 (2005), 549-563.
[11] G. W. Flake and S. Lawrence, Efficient SVM regression training with SMO, Machine Learning, 46 (2002), 271-290.
[12] K. Fukunaga, "Introduction to Statistical Pattern Recognition," Second edition, Computer Science and Scientific Computing, Academic Press, Inc., Boston, MA, 1990.
[13] G. Fung and O. L. Mangasarian, Finite Newton method for Lagrangian support vector machine classification, Neurocomputing, 55 (2003), 39-55.
[14] D. Y. Gao, Canonical dual transformation method and generalized triality theory in nonsmooth global optimization, J. Global Optimization, 17 (2000), 127-160.
[15] D. Y. Gao, Perfect duality theory and complete set of solutions to a class of global optimization, Optimization, 52 (2003), 467-493. doi: 10.1080/02331930310001611501.
[16] D. Y. Gao, Complete solutions to constrained quadratic optimization problems, Journal of Global Optimization, 29 (2004), 377-399. doi: 10.1023/B:JOGO.0000048034.94449.e3.
[17] D. Y. Gao, Sufficient conditions and perfect duality in nonconvex minimization with inequality constraints, Journal of Industrial and Management Optimization, 1 (2005), 53-63.
[18] D. Y. Gao, Complete solutions and extremality criteria to polynomial optimization problems, Journal of Global Optimization, 35 (2006), 131-143. doi: 10.1007/s10898-005-3068-5.
[19] L. Gonzalez, C. Angulo, F. Velasco and A. Catala, Dual unification of bi-class support vector machine formulations, Pattern Recognition, 39 (2006), 1325-1332.
[20] A. G. Hadigheh and T. Terlaky, Generalized support set invariancy sensitivity analysis in linear optimization, Journal of Industrial and Management Optimization, 2 (2006), 1-18.
[21] Q. He, Z.-Z. Shi, L.-A. Ren and E. S. Lee, A novel classification method based on hyper-surface, Mathematical and Computer Modelling, 38 (2003), 395-407. doi: 10.1016/S0895-7177(03)90096-3.
[22] C. W. Hsu and C. J. Lin, A simple decomposition method for support vector machines, Machine Learning, 46 (2002), 291-314.
[23] T. Joachims, Making large-scale support vector machine learning practical, in "Advances in Kernel Methods: Support Vector Learning," (eds. B. Schölkopf, C. J. C. Burges and A. Smola), The MIT Press, Cambridge, Massachusetts, 1999.
[24] S. S. Keerthi, K. B. Duan, S. K. Shevade and A. N. Poo, A fast dual algorithm for kernel logistic regression, Machine Learning, 61 (2005), 151-165.
[25] P. Laskov, Feasible direction decomposition algorithms for training support vector machines, Machine Learning, 46 (2002), 315-349.
[26] Y.-J. Lee, W. F. Hsieh and C. M. Huang, $\epsilon$-SSVR: A smooth support vector machine for $\epsilon$-insensitive regression, IEEE Transactions on Knowledge and Data Engineering, 17 (2005), 678-685. doi: 10.1109/TKDE.2005.77.
[27] Y.-J. Lee and O. L. Mangasarian, SSVM: A smooth support vector machine for classification, Computational Optimization and Applications, 22 (2001), 5-22.
[28] O. L. Mangasarian and D. R. Musicant, Successive overrelaxation for support vector machines, IEEE Transactions on Neural Networks, 10 (1999), 1032-1037. doi: 10.1109/72.788643.
[29] T. M. Mitchell, "Machine Learning," McGraw-Hill Companies, Inc., 1997.
[30] T. Mitchell, Statistical Approaches to Learning and Discovery, lecture notes for the Machine Learning course at CMU, 2003.
[31] D. Montgomery, "Design and Analysis of Experiments," Third edition, John Wiley & Sons, Inc., New York, 1991.
[32] D. J. Newman, S. Hettich, C. L. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," University of California, Department of Information and Computer Science, Irvine, CA, 1998. Available from: http://www.ics.uci.edu/~mlearn/MLRepository.html.
[33] P.-F. Pai, System reliability forecasting by support vector machines with genetic algorithms, Mathematical and Computer Modelling, 43 (2006), 262-274. doi: 10.1016/j.mcm.2005.02.008.
[34] N. Panda and E. Y. Chang, KDX: An indexer for support vector machines, IEEE Transactions on Knowledge and Data Engineering, 18 (2006), 748-763. doi: 10.1109/TKDE.2006.101.
[35] J. Platt, Sequential minimal optimization: A fast algorithm for training support vector machines, in "Advances in Kernel Methods: Support Vector Learning," (eds. B. Schölkopf, C. J. C. Burges and A. Smola), The MIT Press, Cambridge, Massachusetts, 1999, 185-208.
[36] K. Schittkowski, Optimal parameter selection in support vector machines, Journal of Industrial and Management Optimization, 1 (2005), 465-476.
[37] B. Schölkopf, "Support Vector Learning," R. Oldenbourg Verlag, Munich, 1997.
[38] V. Vapnik, "The Nature of Statistical Learning Theory," Springer-Verlag, New York, 1995.
[39] V. Vapnik, The support vector method of function estimation, in "Neural Networks and Machine Learning," NATO ASI Series, (ed. C. Bishop), Springer, 1998.
[40] V. Vapnik, An overview of statistical learning theory, in "Advances in Kernel Methods: Support Vector Learning," (eds. B. Schölkopf, C. J. C. Burges and A. Smola), The MIT Press, Cambridge, Massachusetts, 1999.
[41] V. Vapnik, Three remarks on support vector function estimation, IEEE Transactions on Neural Networks, 10 (1999), 988-1000.
[42] Z. Y. Wu, H. W. J. Lee, F. S. Bai and L. S. Zhang, Quadratic smoothing approximation to $l_1$ exact penalty function in global optimization, Journal of Industrial and Management Optimization, 1 (2005), 533-547.
[43] K. F. C. Yiu, K. L. Mak and K. L. Teo, Airfoil design via optimal control theory, Journal of Industrial and Management Optimization, 1 (2005), 133-148.
[44] Y. Yuan, J. Yan and C. Xu, Polynomial smooth support vector machine (PSSVM), Chinese Journal of Computers, 28 (2005), 9-17.
[45] Y. Yuan and T. Huang, A polynomial smooth support vector machine for classification, Lecture Notes in Artificial Intelligence, 3584 (2005), 157-164.
[46] Y. Yuan and R. Byrd, Non-quasi-Newton updates for unconstrained optimization, J. Comput. Math., 13 (1995), 95-107.
[47] Y. Yuan, A modified BFGS algorithm for unconstrained optimization, IMA J. Numer. Anal., 11 (1991), 325-332. doi: 10.1093/imanum/11.3.325.