July  2022, 18(4): 2977-3000. doi: 10.3934/jimo.2021099

A novel quality prediction method based on feature selection considering high dimensional product quality data

1. School of Management, Hefei University of Technology, Hefei 230009, China

2. Center for Applied Optimization, Department of Industrial and Systems Engineering, University of Florida, Gainesville, FL 32611, USA

3. Key Laboratory of Process Optimization and Intelligent Decision-making of the Ministry of Education, Hefei 230009, China

4. School of Economics, Hefei University of Technology, Hefei 230009, China

5. Ministry of Education Engineering Research Center for Intelligent Decision-Making & Information System Technologies, Hefei 230009, China

* Corresponding authors: Xiaofei Qian, Xinbao Liu

Received: December 2019. Revised: January 2021. Early access: May 2021. Published: July 2022.

Product quality is the lifeline of enterprise survival and development. With the rapid development of information technology, the semiconductor manufacturing process produces a multitude of quality features. As the number of quality features grows, the demands on the training time and classification accuracy of quality prediction methods become increasingly stringent. Aiming at quality prediction for the semiconductor manufacturing process, this paper proposes a modified support vector machine (SVM) model based on feature selection that accounts for the high-dimensional and nonlinear characteristics of the data. The model first improves the radial basis function (RBF) kernel in the SVM, and then combines the Duelist algorithm (DA) and variable neighborhood search (VNS) for feature selection and parameter optimization. Compared with other SVM models based on DA, the genetic algorithm (GA), and the information gain (IG) algorithm, the experimental results show that our DA-VNS-SVM obtains a higher classification accuracy with a smaller feature subset. In addition, we compare DA-VNS-SVM with common machine learning algorithms such as logistic regression, naive Bayes, decision tree, random forest, and artificial neural networks. The results indicate that our model outperforms these algorithms for semiconductor quality prediction.
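The abstract describes a wrapper approach: a search procedure proposes feature subsets, and each subset is scored by the cross-validated accuracy of an RBF-kernel SVM. The sketch below illustrates only this general wrapper idea with a simple variable-neighborhood-style local search on synthetic data; it is not the authors' DA-VNS algorithm, and it assumes scikit-learn's `SVC` in place of the paper's modified RBF kernel.

```python
# Illustrative wrapper feature selection around an RBF-kernel SVM.
# NOT the paper's DA-VNS method: just a bit-flip local search whose
# neighborhood size k widens when no improving subset is found.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=5, random_state=0)

def score(mask):
    """Cross-validated accuracy of an RBF SVM on the selected features."""
    if not mask.any():          # empty subset: nothing to train on
        return 0.0
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

mask = rng.random(X.shape[1]) < 0.5   # random initial feature subset
best = score(mask)
for k in (1, 2, 3):                   # widening neighborhoods
    improved = True
    while improved:
        improved = False
        for _ in range(30):           # sample candidates in neighborhood k
            cand = mask.copy()
            flip = rng.choice(X.shape[1], size=k, replace=False)
            cand[flip] = ~cand[flip]  # flip k feature bits
            s = score(cand)
            if s > best:              # accept first improvement, restart
                mask, best, improved = cand, s, True
                break

print(f"selected {mask.sum()} of {X.shape[1]} features, CV accuracy {best:.3f}")
```

In the paper, this inner search is driven by DA and VNS jointly over the feature mask and the SVM hyperparameters (C, gamma); here the hyperparameters are held fixed to keep the sketch short.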

Citation: Junying Hu, Xiaofei Qian, Jun Pei, Changchun Tan, Panos M. Pardalos, Xinbao Liu. A novel quality prediction method based on feature selection considering high dimensional product quality data. Journal of Industrial and Management Optimization, 2022, 18 (4) : 2977-3000. doi: 10.3934/jimo.2021099
References:
[1]

M. M. Adankon and M. Cheriet, Model selection for the LS-SVM. Application to handwriting recognition, Pattern Recognition, 42 (2009), 3264-3270.  doi: 10.1016/j.patcog.2008.10.023.

[2]

S. Agatonovic-Kustrin and R. Beresford, Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research, Journal of Pharmaceutical & Biomedical Analysis, 22 (2000), 717-727.  doi: 10.1016/S0731-7085(99)00272-1.

[3]

M. A. Ahmadi and A. Bahadori, A LSSVM approach for determining well placement and conning phenomena in horizontal wells, Fuel, 153 (2015), 276-283.  doi: 10.1016/j.fuel.2015.02.094.

[4]

M. Al-Kharaz, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Quality prediction in semiconductor manufacturing processes using multilayer perceptron feedforward artificial neural network, 2019 8th International Conference on Systems and Control (ICSC), (2019), 423–428. doi: 10.1109/ICSC47195.2019.8950664.

[5]

D. Alagić, O. Bluder and J. Pilz, Quantification and prediction of damage in SAM images of semiconductor devices, International Conference Image Analysis and Recognition, Lecture Notes in Computer Science, 10882, Springer, Cham, 2018,490–496. doi: 10.1007/978-3-319-93000-8_55.

[6]

E. Alba, J. Garcia-Nieto, L. Jourdan and E. Talbi, Gene selection in cancer classification using GPSO/SVM and GA/SVM hybrid algorithms, IEEE Congress on Evolutionary Computation, IEEE, (2007), 284–290. doi: 10.1109/CEC.2007.4424483.

[7]

N. Allias, M. N. M. M. Noor, M. N. Ismail and K. de Silva, A hybrid Gini PSO-SVM feature selection: An empirical study of population sizes on different classifiers, International Conference on Artificial Intelligence, (2013), 107–110. doi: 10.1109/AIMS.2013.24.

[8]

A. Asuncion and D. J. Newman, UCI Machine Learning Repository, 2007.

[9]

H. Bae, S. S. Kim, K. B. Woo, G. S. May and D. K. Lee, Fault detection, diagnosis, and optimization of wafer manufacturing processes utilizing knowledge creation, International Journal of Control, Automation, and Systems, 4 (2006), 372-381.

[10]

M. G. Bari, X. Ma and J. Zhang, PeakLink: A new peptide peak linking method in LC-MS/MS using wavelet and SVM, Bioinformatics, 30 (2014), 2464-2470.  doi: 10.1093/bioinformatics/btu299.

[11]

A. Bazzani, A. Bevilacqua, D. Bollini, et al., An SVM classifier to separate false signals from microcalcifications in digital mammograms, Physics in Medicine and Biology, 46 (2001), 1651–1651. doi: 10.1088/0031-9155/46/6/305.

[12]

T. R. Biyanto, et al., Duelist algorithm: An algorithm inspired by how duelist improve their capabilities in a duel, The Seventh International Conference on Swarm Intelligence, 2016, 39–47. doi: 10.1007/978-3-319-41000-5_4.

[13]

B. Bonev, Feature selection based on information theory, Universidad de Alicante, 2010. Available from: http://hdl.handle.net/10045/18362.

[14]

W. Chen, Z. Li and J. Guo, A VNS-EDA algorithm-based feature selection for credit risk classification, Mathematical Problems in Engineering, 2020 (2020), 14 pp. doi: 10.1155/2020/4515480.

[15]

V. Cherkassky and Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, 17 (2004), 113-126.  doi: 10.1016/S0893-6080(03)00169-2.

[16]

P.-H. Chou, M.-J. Wu and K.-K. Chen, Integrating support vector machine and genetic algorithm to implement dynamic wafer quality prediction system, Expert Systems with Applications, 37 (2010), 4413-4424.  doi: 10.1016/j.eswa.2009.11.087.

[17]

C. Cortes and V. Vapnik, Support-Vector Networks, Machine Learning, 20 (1995), 273-297.  doi: 10.1007/BF00994018.

[18]

H. Costa, L. R. Galvao, L. H. C. Merschmann and M. J. F. Souza, A VNS algorithm for feature selection in hierarchical classification context, Electronic Notes in Discrete Mathematics, 66 (2018), 79-86.  doi: 10.1016/j.endm.2018.03.011.

[19]

N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, 2000.

[20]

K. Fridgeirsdottir, R. Akella, L.-M. Al, Statistical methodology for yield enhancement via baseline reduction, Advanced Semiconductor Manufacturing Conference & Workshop, (1998), 77–81. doi: 10.1109/ASMC.1998.731402.

[21]

J. Derrac, C. Cornelis, S. García and F. Herrera, Enhancing evolutionary instance selection algorithms by means of fuzzy rough set based feature selection, Information Sciences, 186 (2012), 73-92.  doi: 10.1016/j.ins.2011.09.027.

[22]

S. Dong, Y. Zhang, Z. He, et al., Investigation of support vector machine and back propagation artificial neural network for performance prediction of the organic Rankine cycle system, Energy, 144 (2018), 851–864. doi: 10.1016/j.energy.2017.12.094.

[23]

R. Dong, J. Xu and B. Lin, ROI-based study on impact factors of distributed PV projects by LSSVM-PSO, Energy, 124 (2017), 336-349.  doi: 10.1016/j.energy.2017.02.056.

[24]

A. L. Ellefsen, E. Bjorlykhaug, V. Esoy, et al., Remaining useful life predictions for turbofan engine degradation using semi-supervised deep architecture, Reliability Engineering & System Safety, 183 (2019), 240-251.

[25]

A. O. Filho, A. C. Silva, A. C. de Paiva, et al., Computer-aided diagnosis of lung nodules in computed tomography by using phylogenetic diversity, genetic algorithm, and SVM, Journal of Digital Imaging, 30 (2017), 812–822. doi: 10.1007/s10278-017-9973-6.

[26]

B. Frénay, G. Doquire and M. Verleysen, Is mutual information adequate for feature selection in regression?, Neural Networks, 48 (2013), 1-7.  doi: 10.1016/j.neunet.2013.07.003.

[27]

R. Fujimaki, T. Yairi and K. Machida, An anomaly detection method for spacecraft using relevance vector learning, Pacific-Asia Conference on Knowledge Discovery & Data Mining, Lecture Notes in Computer Science, 3518, Springer, Berlin, Heidelberg, 2005,785–790. doi: 10.1007/11430919_92.

[28]

M. Garcia-Torres, F. C. Garcia-López, B. Melián-Batista, J. A. Moreno-Pérez and J. M. Moreno-Vega, Solving feature subset selection problem by a hybrid metaheuristic, Hybrid Metaheuristics, (2004), 59–68.

[29]

M. Garcia-Torres, R. Armananzas, C. Bielza, et al., Comparison of metaheuristic strategies for peakbin selection in proteomic mass spectrometry data, Information Sciences, 222 (2013), 229–246. doi: 10.1016/j.ins.2010.12.013.

[30]

M. García-Torres, F. Gómez-Vela, B. Melián-Batista and M. Moreno-Vega, High-dimensional feature selection via feature grouping: A variable neighborhood search approach, Information Sciences, 326 (2016), 102-118.  doi: 10.1016/j.ins.2015.07.041.

[31]

J. Hua, W. D. Tembe and E. R. Dougherty, Performance of feature-selection methods in the classification of high-dimension data, Pattern Recognition, 42 (2009), 409-424.  doi: 10.1016/j.patcog.2008.08.001.

[32]

C.-L. Huang and J.-F. Dun, A distributed PSO-SVM hybrid system with feature selection and parameter optimization, Applied Soft Computing, 8 (2008), 1381-1391.  doi: 10.1016/j.asoc.2007.10.007.

[33]

P. Janik and T. Lobos, Automated classification of power-quality disturbances using SVM and RBF networks, IEEE Transactions on Power Delivery, 21 (2006), 1663-1669.  doi: 10.1109/TPWRD.2006.874114.

[34]

A. C. Janssens, Y. Deng, G. J. Borsboom, et al., A new logistic regression approach for the evaluation of diagnostic test results, Medical Decision Making, 25 (2005), 168–177. doi: 10.1177/0272989X05275154.

[35]

Y.-S. Jeong, B. Kim and Y.-D. Ko, Exponentially weighted moving average-based procedure with adaptive thresholding for monitoring nonlinear profiles: Monitoring of plasma etch process in semiconductor manufacturing, Expert Systems with Applications, 40 (2013), 5688-5693.  doi: 10.1016/j.eswa.2013.04.016.

[36]

S. S. Keerthi and E. G. Gilbert, Convergence of a generalized SMO algorithm for SVM classifier design, Machine Learning, 46 (2002), 351-360.  doi: 10.1023/A:1012431217818.

[37]

S. J. Kim, H. G. Yoon, K. B. Lee, et al., Hybrid overlay modeling for field-by-field error correction in the photolithography process, IEEE Transactions on Semiconductor Manufacturing, 33 (2020), 53–61. doi: 10.1109/TSM.2019.2957508.

[38]

B. Kim, D. W. Kim and G. T. Park, Prediction of plasma etching using a polynomial neural network, IEEE Transactions on Plasma Science, 31 (2003), 1330-1336.  doi: 10.1109/TPS.2003.820681.

[39]

T. Kolodziejczyk, R. Toscano, S. Fouvry and G. Morales-Espejel, Artificial intelligence as efficient technique for ball bearing fretting wear damage prediction, Wear, 268 (2010), 309-315.  doi: 10.1016/j.wear.2009.08.016.

[40]

M. Kumar, M. Bhasin, N. K. Natt, et al., BhairPred: Prediction of β-hairpins in a protein from multiple alignment information using ANN and SVM techniques, Nucleic Acids Research, 33 (2005), W154–W159. doi: 10.1093/nar/gki588.

[41]

Y.-X. Lai, C.-F. Lai, Y.-M. Huang and H.-C. Chao, Multi-appliance recognition system with hybrid SVM/GMM classifier in ubiquitous smart home, Information Sciences, 230 (2013), 39-55.  doi: 10.1016/j.ins.2012.10.002.

[42]

M. Last and A. Kandel, Data Mining for Process and Quality Control in the Semiconductor Industry, 3, Data Mining for Design and Manufacturing, Springer, Boston, MA, 2001. doi: 10.1007/978-1-4757-4911-3_9.

[43]

H. Li, C.-J. Li, X.-J. Wu and J. Sun, Statistics-based wrapper for feature selection: An implementation on financial distress identification with support vector machine, Applied Soft Computing, 19 (2014), 57-67.  doi: 10.1016/j.asoc.2014.01.018.

[44]

S. Li, H. Wu, D. Wan and J. Zhu, An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine, Knowledge-Based Systems, 24 (2011), 40-48.  doi: 10.1016/j.knosys.2010.07.003.

[45]

Y. Li and X. Zhang, Diffusion maps based k-nearest-neighbor rule technique for semiconductor manufacturing process fault detection, Chemometrics and Intelligent Laboratory Systems, 136 (2014), 47-57.  doi: 10.1016/j.chemolab.2014.05.003.

[46]

H. Liu and H. Motoda, Feature Extraction, Construction and Selection: A Data Mining Perspective, Kluwer Academic Publishers, 1998.

[47]

J. Long, S. Zhang and C. Li, Evolving deep echo state networks for intelligent fault diagnosis, IEEE Transactions on Industrial Informatics, 16 (2020), 4928-4937.  doi: 10.1109/TII.2019.2938884.

[48]

J. Long, Z. Sun, C. Li, Y. Hong, Y. Bai and S. Zhang, A novel sparse echo autoencoder network for data-driven fault diagnosis of delta 3-D printers, IEEE Transactions on Instrumentation and Measurement, 69 (2020), 683-692.  doi: 10.1109/TIM.2019.2905752.

[49]

U. Maulik, A. Mukhopadhyay and D. Chakraborty, Gene-expression-based cancer subtypes prediction through feature selection and transductive SVM, IEEE Transactions on Biomedical Engineering, 60 (2013), 1111-1117.  doi: 10.1109/TBME.2012.2225622.

[50]

M. Melhem, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Product quality prediction using alarm data: Application to the semiconductor manufacturing process, 2017 25th Mediterranean Conference on Control and Automation (MED), (2017), 1332–1338. doi: 10.1109/MED.2017.7984303.

[51]

L. Mönch, J. W. Fowler and S. J. Mason, Production Planning and Control for Semiconductor Wafer Fabrication Facilities: Modeling, Analysis, and Systems, 52, Springer, New York, 2012.

[52]

A. Mucherino and L. Liberti, A VNS-based heuristic for feature selection in data mining, Hybrid Metaheuristics, Studies in Computational Intelligence, 434, Springer, Berlin, Heidelberg, 2013,353–368. doi: 10.1007/978-3-642-30671-6_13.

[53]

J. Neumann, C. Schnorr and G. Steidl, Combined SVM-based feature selection and classification, Machine Learning, 61 (2005), 129-150.  doi: 10.1007/s10994-005-1505-9.

[54]

Y. Oh, K. Ransikarbum, M. Busogi, et al., Adaptive SVM-based real-time quality assessment for primer-sealer dispensing process of sunroof assembly line, Reliability Engineering & System Safety, 184 (2019), 202–212. doi: 10.1016/j.ress.2018.03.020.

[55]

A. V. Phan, M. L. Nguyen and L. T. Bui, Feature weighting and SVM parameters optimization based on genetic algorithms for classification problems, Appl. Intell., 46 (2017), 455-469.  doi: 10.1007/s10489-016-0843-6.

[56]

H. Purwins, et al., Regression methods for prediction of PECVD silicon nitride layer thickness, 2011 IEEE International Conference on Automation Science and Engineering, (2011), 387–392. doi: 10.1109/CASE.2011.6042426.

[57]

J. R. Quinlan, Induction of decision trees, Machine Learning, 1 (1986), 81-106.  doi: 10.1007/BF00116251.

[58]

M. S. Rahman, M. K. Rahman, M. Kaykobad and M. S. Rahman, isGPT: An optimized model to identify sub-Golgi protein types using SVM and random forest based feature selection, Artificial Intelligence in Medicine, 84 (2017), 90-100.  doi: 10.1016/j.artmed.2017.11.003.

[59]

I. Rish, An empirical study of the naive Bayes classifier, IJCAI 2001 Workshop on Empirical Methods in Artificial Intelligence, 3 (2001), 41-46. 

[60]

Y.-C. Su, M.-H. Hung, F.-T. Cheng and Y.-T. Chen, A processing quality prognostics scheme for plasma sputtering in TFT-LCD manufacturing, IEEE Transactions on Semiconductor Manufacturing, 19 (2006), 183-194.  doi: 10.1109/TSM.2006.873514.

[61]

J. B. Tenenbaum, V. de Silva and J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science, 290 (2000), 2319-2323.  doi: 10.1126/science.290.5500.2319.

[62]

A. Unler, A. Murat and R. B. Chinnam, mr2PSO: A maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification, Information Sciences, 181 (2011), 4625-4641.  doi: 10.1016/j.ins.2010.05.037.

[63]

Y. Wang, Z. Xue, G. Shen, et al., PRINTR: Prediction of RNA binding sites in proteins using SVM and profiles, Amino Acids, 35 (2008), 295–302. doi: 10.1007/s00726-007-0634-9.

[64]

R. Weber and J. Basak, Simultaneous feature selection and classification using kernel-penalized support vector machines, Information Sciences, 181 (2011), 115-128.  doi: 10.1016/j.ins.2010.08.047.

[65]

C. Wei, J. Chen, Z. Song and C. Chen, Soft sensors of nonlinear industrial processes based on self-learning kernel regression model, Asian Control Conference, IEEE, (2018), 1783–1788. doi: 10.1109/ASCC.2017.8287444.

[66]

X. Wu, L. Chen, S. Pang and X. Ding, A paratactic subjective-objective weighting methods and SVM risk assessment model applied in textile and apparel safety, International Journal of Quality & Reliability Management, 32 (2015), 472-485.  doi: 10.1108/IJQRM-06-2013-0102.

[67]

Y. Xiang and L. Jiang, Water quality prediction using LS-SVM and particle swarm optimization, 2009 Second International Workshop on Knowledge Discovery and Data Mining, (2009), 900–904. doi: 10.1109/WKDD.2009.217.

[68]

S. D. Xin and C. B. Zhong, Some practical application of sequential analysis to the fault prediction system of a main diesel engine, Conference of the IEEE Industrial Electronics Society, IEEE, 3 (1991), 2151-2156.  doi: 10.1109/IECON.1991.239009.

[69]

H. Zhang, Q. Li, Z. Sun and Y. Liu, Combining data-driven and model-driven methods for robust facial landmark detection, IEEE Transactions on Information Forensics and Security, 13 (2018), 2409–2422. doi: 10.1109/TIFS.2018.2800901.

[70]

H. Zhang, Q.-Y. Chen, M-L. Xiang, et al., In silico prediction of mitochondrial toxicity by using GA-CG-SVM approach, Toxicology in Vitro, 23 (2009), 134–140. doi: 10.1016/j.tiv.2008.09.017.

[71]

C. Zheng and L. Jiao, Automatic parameters selection for SVM based on GA, Intelligent Control & Automation, 2 (2004), 1869-1872.  doi: 10.1109/WCICA.2004.1341000.

[72]

X. Zhao, E. K. Wong, Y. Wang, et al., A support vector machine (SVM) for predicting preferred treatment position in radiotherapy of patients with breast cancer, Medical Physics, 37 (2010), 5341–5350. doi: 10.1118/1.3483264.

[73]

Y. Zhu, Y. Tan, Y. Hua, et al., Feature selection and performance evaluation of support vector machine (SVM)-based classifier for differentiating benign and malignant pulmonary nodules by computed tomography, Journal of Digital Imaging, 23 (2010), 51–65. doi: 10.1007/s10278-009-9185-9.

show all references

References:
[1]

M. M. Adankon and M. Cheriet, Model selection for the LS-SVM. Application to handwriting recognition, Pattern Recognition, 42 (2009), 3264-3270.  doi: 10.1016/j.patcog.2008.10.023.

[2]

S. Agatonovic-Kustrin and R. Beresford, Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research, Journal of Pharmaceutical & Biomedical Analysis, 22 (2000), 717-727.  doi: 10.1016/S0731-7085(99)00272-1.

[3]

M. A. Ahmadi and A. Bahadori, A LSSVM approach for determining well placement and conning phenomena in horizontal wells, Fuel, 153 (2015), 276-283.  doi: 10.1016/j.fuel.2015.02.094.

[4]

M. Al-Kharaz, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Quality prediction in semiconductor manufacturing processes using multilayer perceptron feedforward artificial neural network, 2019 8th International Conference on Systems and Control (ICSC), (2019), 423–428. doi: 10.1109/ICSC47195.2019.8950664.

[5]

D. Alagić, O. Bluder and J. Pilz, Quantification and prediction of damage in SAM images of semiconductor devices, International Conference Image Analysis and Recognition, Lecture Notes in Computer Science, 10882, Springer, Cham, 2018,490–496. doi: 10.1007/978-3-319-93000-8_55.

[6]

E. Alba, J. Garcia-Nieto, L. Jourdan and E. Talbi, Gene selection in cancer classification using GPSO/SVM and GA/SVM hybrid algorithms, IEEE Congress on Evolutionary Computation, IEEE, (2008), 284–290. doi: 10.1109/CEC.2007.4424483.

[7]

N. Allias, M. N. M. M. Noor, M. N. Ismail, K. de Silva, A hybrid Gini PSO-SVM feature selection: An empirical study of population sizes on different classifier, International Conference on Artificial Intelligence, (2013), 107–110. doi: 10.1109/AIMS.2013.24.

[8]

, Asuncion A, Newman DJ. UCI Machine Learning Repository, 2007.

[9]

H. BaeS. S. KimK. B. WooG. S. May and D. K. Lee, Fault detection, diagnosis, and optimization of wafer manufacturing processes utilizing knowledge creation, International Journal of Control, Automation, and Systems, 4 (2006), 372-381. 

[10]

M. G. BariX. Ma and J. Zhang, PeakLink: A new peptide peak linking method in LC-MS/MS using wavelet and SVM, Bioinformatics, 30 (2014), 2464-2470.  doi: 10.1093/bioinformatics/btu299.

[11]

A. Bazzani, A. Bevilacqua, D. Bollini, et al., An SVM classifier to separate false signals from microcalcifications in digital mammograms, Physics in Medicine and Biology, 46 (2001), 1651–1651. doi: 10.1088/0031-9155/46/6/305.

[12]

T. R. Biyanto, et al., Duelist algorithm: An algorithm inspired by how duelist improve their capabilities in a duel, The Seventh International Conference on Swarm Intelegence, 2016, 39–47. doi: 10.1007/978-3-319-41000-5_4.

[13]

B. Bonev, Feature selection based on information theory, Universidad de Alicante, 2010. Available from: http://hdl.handle.net/10045/18362.

[14]

W. Chen, Z. Li and J. Guo, A VNS-EDA algorithm-based feature selection for credit risk classification, Mathematical Problems in Engineering, 2020 (2020), 14 pp. doi: 10.1155/2020/4515480.

[15]

V. Cherkassky and Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, 17 (2004), 113-126.  doi: 10.1016/S0893-6080(03)00169-2.

[16]

P.-H. ChouM-J. Wu and K.-K. Chen, Integrating support vector machine and genetic algorithm to implement dynamic wafer quality prediction system, Expert Systems with Applications, 37 (2010), 4413-4424.  doi: 10.1016/j.eswa.2009.11.087.

[17]

C. Cortes and V. Vapnik, Support-Vector Networks, Machine Learning, 20 (1995), 273-297.  doi: 10.1007/BF00994018.

[18]

H. CostaL. R. GalvaoL. H. C. Merschmann and M. J. F. Souza, A VNS algorithm for feature selection in hierarchical classification context, Electronic Notes in Discrete Mathematics, 66 (2018), 79-86.  doi: 10.1016/j.endm.2018.03.011.

[19]

N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, 2000.

[20]

K. Fridgeirsdottir, R. Akella, L.-M. Al, Statistical methodology for yield enhancement via baseline reduction, Advanced Semiconductor Manufacturing Conference & Workshop, (1998), 77–81. doi: 10.1109/ASMC.1998.731402.

[21]

J. DerracC. CornelisS. García and F. Herrera, Enhancing evolutionary instance selection algorithms by means of fuzzy rough set based feature selection, Information Sciences, 186 (2012), 73-92.  doi: 10.1016/j.ins.2011.09.027.

[22]

S. Dong, Y. Zhang, Z. He, et al., Investigation of support vector machine and back propagation artificial neural network for performance prediction of the organic Rankine cycle system, Energy, 144 (2018), 851–864. doi: 10.1016/j.energy.2017.12.094.

[23]

R. DongJ. Xu and B. Lin, ROI-based study on impact factors of distributed PV projects by LSSVM-PSO, Energy, 124 (2017), 336-349.  doi: 10.1016/j.energy.2017.02.056.

[24]

A. L. Ellefsen, E. Bjorlykhaug, V. Esoy, et al., Remaining useful life predictions for turbofan engine degradation using semi-supervised deep architecture, Reliability Engineering & System Safety, 183 (2019), 240-251.

[25]

A. O. Filho, A. C. Silva, A. C. de Paiva, et al., Computer-aided diagnosis of lung nodules in computed tomography by using phylogenetic diversity, genetic algorithm, and SVM, Journal of Digital Imaging, 30 (2017), 812–822. doi: 10.1007/s10278-017-9973-6.

[26]

B. FrénayG. Doquire and M. Verleysen, Is mutual information adequate for feature selection in regression, Neural Networks, 48 (2013), 1-7.  doi: 10.1016/j.neunet.2013.07.003.

[27]

R. Fujimaki, T. Yairi and K. Machida, An anomaly detection method for spacecraft using relevance vector learning, Pacific-Asia Conference on Knowledge Discovery & Data Mining, Lecture Notes in Computer Science, 3518, Springer, Berlin, Heidelberg, 2005,785–790. doi: 10.1007/11430919_92.

[28]

M. Garcia-Torres, F. C. Garcia-López, B. Melián-Batista, J. A. Moreno-Pérez and J. M. Moreno-Vega, Solving feature subset selection problem by a hybrid, Hybrid Metaheuristics, (2004), 59–68.

[29]

M. Garcia-Torres, R. Armananzas, C. Bielza, et al., Comparison of metaheuristic strategies for peakbin selection in proteomic mass spectrometry data, Information Sciences, 222 (2013), 229–246. doi: 10.1016/j.ins.2010.12.013.

[30]

M. García-TorresF. Gómez-VelaB. Melián-Batista and M. Moreno-Vega, High-dimensional feature selection via feature grouping: A variable neighborhood search approach, Information Sciences, 326 (2016), 102-118.  doi: 10.1016/j.ins.2015.07.041.

[31]

J. HuaW. D. Tembe and E. R. Dougherty, Performance of feature-selection methods in the classification of high-dimension data, Pattern Recognition, 42 (2009), 409-424.  doi: 10.1016/j.patcog.2008.08.001.

[32]

C.-L. Huang and J.-F. Dun, A distributed PSO-SVM hybrid system with feature selection and parameter optimization, Applied Soft Computing, 8 (2008), 1381-1391.  doi: 10.1016/j.asoc.2007.10.007.

[33]

P. Janik and T. Lobos, Automated classification of power-quality disturbances using SVM and RBF networks, IEEE Transactions on Power Delivery, 21 (2006), 1663-1669.  doi: 10.1109/TPWRD.2006.874114.

[34]

A. C. Janssens, Y. Deng, G. J. Borsboom, et al., A new logistic regression approach for the evaluation of diagnostic test results, Medical Decision Making, 25 (2005), 168–177. doi: 10.1177/0272989X05275154.

[35]

Y.-S. JeongB. Kim and Y-D. Ko, Exponentially weighted moving average-based procedure with adaptive thresholding for monitoring nonlinear profiles: Monitoring of plasma etch process in semiconductor manufacturing, Expert Systems with Applications, 40 (2013), 5688-5693.  doi: 10.1016/j.eswa.2013.04.016.

[36]

S. S. Keerthi and E. G. Gilbert, Convergence of a generalized SMO algorithm for SVM classifier design, Machine Learning, 46 (2002), 351-360.  doi: 10.1023/A:1012431217818.

[37]

S. J. Kim, H. G. Yoon, K. B. Lee, et al., Hybrid overlay modeling for field-by-field error correction in the photolithography process, IEEE Transactions on Semiconductor Manufacturing, 33 (2020), 53–61. doi: 10.1109/TSM.2019.2957508.

[38]

B. KimD. W. Kim and G. T. Park, Prediction of plasma etching using a polynomial neural network, IEEE Transactions on Plasma Science, 31 (2003), 1330-1336.  doi: 10.1109/TPS.2003.820681.

[39]

T. KolodziejczykR. ToscanoS. Fouvry and G. Morales-Espejel, Artificial intelligence as efficient technique for ball bearing fretting wear damage prediction, Wear, 268 (2010), 309-315.  doi: 10.1016/j.wear.2009.08.016.

[40]

M. Kumar, M. Bhasin, N. K. Natt, et al. BhairPred: prediction of β-hairpins in a protein from multiple alignment information using ANN and SVM techniques, Nucleic Acids Research, 33 (2015), 154–159. doi: 10.1093/nar/gki588.

[41]

Y.-X. LaiC-F. LaiY-M. Huang and H.-C. Chao, Multi-appliance recognition system with hybrid SVM/GMM classifier in ubiquitous smart home, Information Sciences, 230 (2013), 39-55.  doi: 10.1016/j.ins.2012.10.002.

[42]

M. Last and A. Kandel, Data Mining for Process and Quality Control in the Semiconductor Industry, 3, Data Mining for Design and Manufacturing, Springer, Boston, MA, 2001. doi: 10.1007/978-1-4757-4911-3_9.

[43]

H. LiC.-J. LiX.-J. Wu and J. Sun, Statistics-based wrapper for feature selection: An implementation on financial distress identification with support vector machine, Applied Soft Computing, 19 (2014), 57-67.  doi: 10.1016/j.asoc.2014.01.018.

[44]

S. LiH. WuD. Wan and J. Zhu, An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine, Knowledge-Based Systems, 24 (2011), 40-48.  doi: 10.1016/j.knosys.2010.07.003.

[45]

Y. Li and X. Zhang, Diffusion maps based k-nearest-neighbor rule technique for semiconductor manufacturing process fault detection, Chemometrics and Intelligent Laboratory Systems, 136 (2014), 47-57.  doi: 10.1016/j.chemolab.2014.05.003.

[46]

H. Liu and H. Motoda, Feature extraction construction and selection: A data mining perspective, Journal of the American Statistical Association, 94 (1999), 014004.

[47]

J. LongS. Zhang and C. Li, Evolving deep echo state networks for intelligent fault diagnosis, IEEE Transactions on Industrial Informatics, 16 (2020), 4928-4937.  doi: 10.1109/TII.2019.2938884.

[48]

J. LongZ. SunC. LiY. HongY. Bai and S. Zhang, A novel sparse echo autoencoder network for data-driven fault diagnosis of delta 3-D printers, IEEE Transactions on Instrumentation and Measurement, 69 (2020), 683-692.  doi: 10.1109/TIM.2019.2905752.

[49]

U. MaulikA. Mukhopadhyay and D. Chakraborty, Gene-expression-based cancer subtypes prediction through feature selection and transductive SVM, IEEE Transactions on Biomedical Engineering, 60 (2013), 1111-1117.  doi: 10.1109/TBME.2012.2225622.

[50]

M. Melhem, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Product quality prediction using alarm data : Application to the semiconductor manufacturing process, 2017 25th Mediterranean Conference on Control and Automation (MED), (2017), 1332–1338. doi: 10.1109/MED.2017.7984303.

[51]

L. Mönch, J. W. Fowler and S. J. Mason, Production Planning and Control for Semiconductor Wafer Fabrication Facilities: Modeling, Analysis, and Systems, 52, Springer Science & Business Media, Springer, New York, 2012.

[52]

A. Mucherino and L. Liberti, A VNS-based heuristic for feature selection in data mining, Hybrid Metaheuristics, Studies in Computational Intelligence, 434, Springer, Berlin, Heidelberg, 2013,353–368. doi: 10.1007/978-3-642-30671-6_13.

[53]

J. NeumannC. Schnorr and G. Steidl, Combined SVM-based feature selection and classification, Machine Learning, 61 (2005), 129-150.  doi: 10.1007/s10994-005-1505-9.

[54]

Y. Oh, K. Ransikarbum, M. Busogi, et al., Adaptive SVM-based real-time quality assessment for primer-sealer dispensing process of sunroof assembly line, Reliability Engineering & System Safety, 184 (2019), 202–212. doi: 10.1016/j.ress.2018.03.020.

[55]

A. V. PhanM. L. Nguyen and L. T. Bui, Feature weighting and SVM parameters optimization based on genetic algorithms for classification problems, Appl. Intell., 46 (2017), 455-469.  doi: 10.1007/s10489-016-0843-6.

[56]

H. Purwins, et al., Regression methods for prediction of PECVD silicon nitride layer thickness, 2011 IEEE International Conference on Automation Science and Engineering, (2011), 387–392. doi: 10.1109/CASE.2011.6042426.

[57]

J. R. Quinlan, Induction on decision tree, Machine Learning, 1 (1986), 81-106.  doi: 10.1007/BF00116251.

[58]

M. S. RahmanM. K. RahmanM. Kaykobad and M. S. Rahman, isGPT: An optimized model to identify sub-Golgi protein types using SVM and random forest based feature selection, Artificial Intelligence in Medicine, 84 (2017), 90-100.  doi: 10.1016/j.artmed.2017.11.003.

[59]

I. Rish, An empirical study of the naive Bayes classifier, IJCAI 2001 Workshop on Empirical Methods in Artificial Intelligence, 3 (2001), 41-46. 

[60]

Y-C. SuM.-H. HungF.-T. Cheng and Y-T. Chen, A processing quality prognostics scheme for plasma sputtering in TFT-LCD manufacturing, IEEE Transactions on Semiconductor Manufacturing, 19 (2006), 183-194.  doi: 10.1109/TSM.2006.873514.

[61]

J. B. TenenbaumV. D. Silva and J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science, 290 (2000), 2319-2323.  doi: 10.1126/science.290.5500.2319.

[62]

A. Unler, A. Murat and R. B. Chinnam, mr2PSO: A maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification, Information Sciences, 181 (2011), 4625-4641. doi: 10.1016/j.ins.2010.05.037.

[63]

Y. Wang, Z. Xue, G. Shen, et al., PRINTR: Prediction of RNA binding sites in proteins using SVM and profiles, Amino Acids, 35 (2008), 295–302. doi: 10.1007/s00726-007-0634-9.

[64]

R. Weber and J. Basak, Simultaneous feature selection and classification using kernel-penalized support vector machines, Information Sciences, 181 (2011), 115-128.  doi: 10.1016/j.ins.2010.08.047.

[65]

C. Wei, J. Chen, Z. Song and C. Chen, Soft sensors of nonlinear industrial processes based on self-learning kernel regression model, Asian Control Conference, IEEE, (2018), 1783–1788. doi: 10.1109/ASCC.2017.8287444.

[66]

X. Wu, L. Chen, S. Pang and X. Ding, A paratactic subjective-objective weighting method and SVM risk assessment model applied in textile and apparel safety, International Journal of Quality & Reliability Management, 32 (2015), 472-485. doi: 10.1108/IJQRM-06-2013-0102.

[67]

Y. Xiang and L. Jiang, Water quality prediction using LS-SVM and particle swarm optimization, 2009 Second International Workshop on Knowledge Discovery and Data Mining, (2009), 900–904. doi: 10.1109/WKDD.2009.217.

[68]

S. D. Xin and C. B. Zhong, Some practical application of sequential analysis to the fault prediction system of a main diesel engine, Conference of the IEEE Industrial Electronics Society, IEEE, 3 (2002), 2151-2156.  doi: 10.1109/IECON.1991.239009.

[69]

H. Zhang, Q. Li, Z. Sun and Y. Liu, Combining data-driven and model-driven methods for robust facial landmark detection, IEEE Transactions on Information Forensics and Security, (2016), 2409–2422. doi: 10.1109/TIFS.2018.2800901.

[70]

H. Zhang, Q.-Y. Chen, M-L. Xiang, et al., In silico prediction of mitochondrial toxicity by using GA-CG-SVM approach, Toxicology in Vitro, 23 (2009), 134–140. doi: 10.1016/j.tiv.2008.09.017.

[71]

C. Zheng and L. Jiao, Automatic parameters selection for SVM based on GA, Intelligent Control & Automation, 2 (2004), 1869-1872.  doi: 10.1109/WCICA.2004.1341000.

[72]

X. Zhao, E. K. Wong, Y. Wang, et al., A support vector machine (SVM) for predicting preferred treatment position in radiotherapy of patients with breast cancer, Medical Physics, 37 (2010), 5341–5350. doi: 10.1118/1.3483264.

[73]

Y. Zhu, Y. Tan, Y. Hua, et al., Feature selection and performance evaluation of support vector machine (SVM)-based classifier for differentiating benign and malignant pulmonary nodules by computed tomography, Journal of Digital Imaging, 23 (2010), 51–65. doi: 10.1007/s10278-009-9185-9.

Figure 1.  Stages of semiconductor manufacturing
Figure 2.  Flowchart of proposed method
Figure 3.  Flowchart of DA-VNS algorithm
Figure 4.  Encoding of DA-VNS algorithm
Figure 5.  Running results after 500 iterations of GA, DA and DA-VNS algorithms respectively (The blue points indicate the projection of the solutions on $ (q(\theta),R(\theta)) $.)
Figure 6.  The evolution of the best $ q(\theta) $ and $ R(\theta) $ for GA, DA and DA-VNS respectively over 500 iterations
Figure 7.  Performance improvement between DA-VNS-SVM and other algorithms
Table 1.  Quality prediction problems in semiconductor manufacturing processes in recent years
Publications Problems Methods Data Driven
Fridgeirsdottir [73] Fault diagnosis Data mining
Kim [37] Prediction of plasma etch processes PNN
Bae [9] Modeling and rule extraction of the ingot fabrication DPNN
Su [59] Quality prognostics for plasma sputtering NN
Chou [16] Prediction of dynamic wafer quality SVM
Purwins [55] Prediction of Silicon Nitride layer thickness Collinearity regression
Melhem [49] Prediction of batch scrap Regularized regression
Alagic [5] Prediction of the damage intensity Image processing and statistical modeling
Al-Kharaz [4] Prediction of quality state ANN
Kim [36] Prediction of wafer errors Ordinary least squares regression and ridge regression
Table 2.  Common Kernel Function
Kernel function name Kernel function representation
Radial basis function $ \kappa(x_i,x_j)=\exp(-\gamma ||x_i-x_j||^2) $
Linear kernel function $ \kappa(x_i,x_j)=x_i\cdot x_j $
Polynomial kernel function $ \kappa(x_i,x_j)=(x_i\cdot x_j+1)^d $
Sigmoid kernel function $ \kappa(x_i,x_j)=\tanh(\eta \langle x_i, x_j\rangle+\theta) $
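For concreteness, the kernel functions of Table 2 can be written out directly. This is a minimal sketch in their standard textbook forms; the function names and default parameter values are illustrative assumptions, not part of the paper's implementation.

```python
import numpy as np

def rbf_kernel(xi, xj, gamma=1.0):
    # Radial basis function: exp(-gamma * ||xi - xj||^2)
    return np.exp(-gamma * np.sum((xi - xj) ** 2))

def linear_kernel(xi, xj):
    # Linear kernel: xi . xj
    return float(np.dot(xi, xj))

def polynomial_kernel(xi, xj, d=2):
    # Polynomial kernel: (xi . xj + 1)^d
    return (float(np.dot(xi, xj)) + 1.0) ** d

def sigmoid_kernel(xi, xj, eta=0.01, theta=0.0):
    # Sigmoid kernel: tanh(eta * <xi, xj> + theta)
    return np.tanh(eta * float(np.dot(xi, xj)) + theta)
```

Each function maps a pair of feature vectors to a scalar similarity, which is all an SVM needs from a kernel.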
Table 3.  List of preset parameters in DA-VNS
Parameters $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $ $ DA-VNS_{\kappa_{2}} $
Population size 100 100 100
Iteration times 500 500 500
Nearest neighbor number / 5 5
Learning probability 0.8 0.8 0.8
Innovation probability 0.1 0.1 0.1
Mutation probability 0.1 0.1 0.1
Search range of penalty parameter $ C $ $ [10^{-3},10^{3}] $ $ [10^{-3},10^3] $ $ [10^{-3},10^3] $
Search range of kernel width $ \gamma $ $ [2^{-6},2^6] $ $ [2^{-6},2^6] $ $ [2^{-6},2^6] $
Search range of amplitude regulating parameter $ t_1 $ / / $ [-10,10] $
Search range of displacement regulating parameter $ t_2 $ / / $ [-10,10] $
Luck coefficient {0, 0.01, 0.1, 0.2, 0.5} {0, 0.01, 0.1, 0.2, 0.5} {0, 0.01, 0.1, 0.2, 0.5}
$ w_c $, $ w_f $, $ c_{f1} $, $ c_{f2} $ 0.8, 0.2, 0.8, 0.2 0.8, 0.2, 0.8, 0.2 0.8, 0.2, 0.8, 0.2
Table 4.  Comparison of performance between DA-VNS and other algorithms
Algorithm Optimal parameters $ q(\theta) $ $ R(\theta) $ Selected features
$ C $ $ \gamma $ $ t_1 $ $ t_2 $
$ GA_{RBF} $ 78.54 3.36 / / 0.6136 0.725 60
$ GA_{\kappa_1} $ 73.11 0.80 / / 0.6217 0.7333 47
$ GA_{\kappa_2} $ 11.21 0.94 1.53 4.41 0.6262 0.7416 56
$ DA_{RBF} $ 96.10 0.52 / / 0.6329 0.75 64
$ DA_{\kappa_{1}} $ 62.34 0.49 / / 0.6559 0.775 43
$ DA_{\kappa_{2}} $ 42.04 0.49 9.81 9.96 0.6695 0.7917 41
IG / / / / 0.6957 0.8105 24
$ DA-VNS_{RBF} $ 21.32 1.31 / / 0.6882 0.8033 48
$ DA-VNS_{\kappa_{1}} $ 60.82 1.60 / / 0.7038 0.8333 36
$ DA-VNS_{\kappa_{2}} $ 96 0.91 -2.69 8.77 0.7221 0.8583 49
Table 5.  Performance improvement of $ q(\theta) $ between DA-VNS and other algorithms. $ Improvement = \frac{q(\theta)-q^{\prime}(\theta)}{q(\theta)}*100\% $
Improvement $ GA_{RBF} $ $ GA_{\kappa_{1}} $ $ GA_{\kappa_{2}} $ $ DA_{RBF} $ $ DA_{\kappa_{1}} $ $ DA_{\kappa_{2}} $ IG $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $
$ DA-VNS_{\kappa_{1}} $ 12.82 11.66 11.02 10.07 6.81 4.87 1.15 2.22 /
$ DA-VNS_{\kappa_{2}} $ 15.03 13.90 13.28 12.35 9.17 7.28 3.66 4.69 2.53
Table 6.  Performance improvement of $ R(\theta) $ between DA-VNS and other algorithms. $ Improvement = \frac{R(\theta)-R^{\prime}(\theta)}{R(\theta)}*100\% $
Improvement $ GA_{RBF} $ $ GA_{\kappa_{1}} $ $ GA_{\kappa_{2}} $ $ DA_{RBF} $ $ DA_{\kappa_{1}} $ $ DA_{\kappa_{2}} $ IG $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $
$ DA-VNS_{\kappa_{1}} $ 13.00 12.00 11.00 10.00 7.00 4.99 2.74 3.60 /
$ DA-VNS_{\kappa_{2}} $ 15.53 14.56 13.60 12.62 9.70 7.76 5.57 6.41 2.91
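The improvement percentages in Tables 5 and 6 follow directly from the caption formula. As a quick check, the first entry of Table 5 can be recomputed from the $ q(\theta) $ values in Table 4:

```python
def improvement(v, v_other):
    # Relative improvement (v - v') / v, expressed as a percentage
    return (v - v_other) / v * 100.0

# DA-VNS_k2 (q = 0.7221) versus GA_RBF (q = 0.6136)
print(round(improvement(0.7221, 0.6136), 2))  # 15.03, matching Table 5
```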
Table 7.  Comparison of performance between DA-VNS-SVM and common machine learning algorithms
Algorithm Accuracy
Logistic Regression (LR) 0.4917
Naive Bayes (NB) 0.6167
Artificial Neural Network (ANN) 0.6417
Decision Tree (DT) 0.658
Random Forest (RF) 0.667
DA-VNS$ _{\kappa_1} $-SVM 0.7038
DA-VNS$ _{\kappa_2} $-SVM 0.7221
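A comparison of the kind reported in Table 7 can be sketched with scikit-learn. The dataset below is synthetic (the paper's semiconductor data is not reproduced here), the DA-VNS feature-selection and parameter-tuning step is omitted, and the resulting accuracies are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a high-dimensional quality dataset
X, y = make_classification(n_samples=600, n_features=50, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "NB": GaussianNB(),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {acc:.4f}")
```

On real semiconductor data, the relative ordering would depend heavily on preprocessing and feature selection, which is precisely the gap DA-VNS-SVM targets.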