doi: 10.3934/jimo.2021099

A novel quality prediction method based on feature selection considering high dimensional product quality data

1. School of Management, Hefei University of Technology, Hefei 230009, China

2. Center for Applied Optimization, Department of Industrial and Systems Engineering, University of Florida, Gainesville, FL 32611, USA

3. Key Laboratory of Process Optimization and Intelligent Decision-Making of the Ministry of Education, Hefei 230009, China

4. School of Economics, Hefei University of Technology, Hefei 230009, China

5. Ministry of Education Engineering Research Center for Intelligent Decision-Making & Information System Technologies, Hefei 230009, China

* Corresponding authors: Xiaofei Qian, Xinbao Liu

Received: December 2019 | Revised: January 2021 | Published: May 2021

Product quality is the lifeline of enterprise survival and development. With the rapid development of information technology, the semiconductor manufacturing process produces a multitude of quality features. As the number of quality features grows, the requirements on the training time and classification accuracy of quality prediction methods become increasingly demanding. Aiming at quality prediction for the semiconductor manufacturing process, this paper proposes a modified support vector machine (SVM) model based on feature selection that accounts for the high-dimensional and nonlinear characteristics of the data. The model first improves the radial basis function (RBF) kernel in the SVM, and then combines the Duelist algorithm (DA) and the variable neighborhood search (VNS) algorithm for feature selection and parameter optimization. Compared with other SVM models based on DA, the genetic algorithm (GA), and the information gain (IG) algorithm, the experimental results show that our DA-VNS-SVM achieves a higher classification accuracy with a smaller feature subset. In addition, we compare DA-VNS-SVM with common machine learning algorithms such as logistic regression, naive Bayes, decision tree, random forest, and artificial neural network. The results indicate that our model outperforms these algorithms for semiconductor quality prediction.
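To make the wrapper idea concrete, the sketch below shows how a candidate solution (a binary feature mask plus the SVM parameters $ C $ and $ \gamma $) could be scored inside such a search. It is only an illustrative sketch built on scikit-learn: the fitness form, the function name, and the weights $ w_c = 0.8 $, $ w_f = 0.2 $ (taken from the settings listed in Table 3) are assumptions, not the authors' exact DA-VNS-SVM implementation.

```python
# Illustrative sketch (not the authors' code): score one candidate of a
# wrapper-style feature-selection search around an RBF-kernel SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def candidate_fitness(X, y, feature_mask, C, gamma, w_c=0.8, w_f=0.2):
    """Higher cross-validated accuracy and fewer selected features give a higher score."""
    selected = np.flatnonzero(feature_mask)          # indices of chosen features
    if selected.size == 0:                           # an empty subset is useless
        return 0.0
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    accuracy = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    feature_reduction = 1.0 - selected.size / X.shape[1]
    return w_c * accuracy + w_f * feature_reduction  # assumed weighted objective
```

A DA or VNS loop would then repeatedly perturb the mask and the parameters, keeping the candidates with the highest fitness.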

Citation: Junying Hu, Xiaofei Qian, Jun Pei, Changchun Tan, Panos M. Pardalos, Xinbao Liu. A novel quality prediction method based on feature selection considering high dimensional product quality data. Journal of Industrial & Management Optimization, doi: 10.3934/jimo.2021099
References:
[1]

M. M. Adankon and M. Cheriet, Model selection for the LS-SVM. Application to handwriting recognition, Pattern Recognition, 42 (2009), 3264-3270.  doi: 10.1016/j.patcog.2008.10.023.  Google Scholar

[2]

S. Agatonovic-Kustrin and R. Beresford, Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research, Journal of Pharmaceutical & Biomedical Analysis, 22 (2000), 717-727.  doi: 10.1016/S0731-7085(99)00272-1.  Google Scholar

[3]

M. A. Ahmadi and A. Bahadori, A LSSVM approach for determining well placement and conning phenomena in horizontal wells, Fuel, 153 (2015), 276-283.  doi: 10.1016/j.fuel.2015.02.094.  Google Scholar

[4]

M. Al-Kharaz, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Quality prediction in semiconductor manufacturing processes using multilayer perceptron feedforward artificial neural network, 2019 8th International Conference on Systems and Control (ICSC), (2019), 423–428. doi: 10.1109/ICSC47195.2019.8950664.  Google Scholar

[5]

D. Alagić, O. Bluder and J. Pilz, Quantification and prediction of damage in SAM images of semiconductor devices, International Conference Image Analysis and Recognition, Lecture Notes in Computer Science, 10882, Springer, Cham, 2018, 490–496. doi: 10.1007/978-3-319-93000-8_55.  Google Scholar

[6]

E. Alba, J. Garcia-Nieto, L. Jourdan and E. Talbi, Gene selection in cancer classification using GPSO/SVM and GA/SVM hybrid algorithms, IEEE Congress on Evolutionary Computation, IEEE, (2008), 284–290. doi: 10.1109/CEC.2007.4424483.  Google Scholar

[7]

N. Allias, M. N. M. M. Noor, M. N. Ismail, K. de Silva, A hybrid Gini PSO-SVM feature selection: An empirical study of population sizes on different classifier, International Conference on Artificial Intelligence, (2013), 107–110. doi: 10.1109/AIMS.2013.24.  Google Scholar

[8]

A. Asuncion and D. J. Newman, UCI Machine Learning Repository, 2007. Google Scholar

[9]

H. Bae, S. S. Kim, K. B. Woo, G. S. May and D. K. Lee, Fault detection, diagnosis, and optimization of wafer manufacturing processes utilizing knowledge creation, International Journal of Control, Automation, and Systems, 4 (2006), 372-381.   Google Scholar

[10]

M. G. Bari, X. Ma and J. Zhang, PeakLink: A new peptide peak linking method in LC-MS/MS using wavelet and SVM, Bioinformatics, 30 (2014), 2464-2470.  doi: 10.1093/bioinformatics/btu299.  Google Scholar

[11]

A. Bazzani, A. Bevilacqua, D. Bollini, et al., An SVM classifier to separate false signals from microcalcifications in digital mammograms, Physics in Medicine and Biology, 46 (2001), 1651–1651. doi: 10.1088/0031-9155/46/6/305.  Google Scholar

[12]

T. R. Biyanto, et al., Duelist algorithm: An algorithm inspired by how duelist improve their capabilities in a duel, The Seventh International Conference on Swarm Intelligence, 2016, 39–47. doi: 10.1007/978-3-319-41000-5_4.  Google Scholar

[13]

B. Bonev, Feature selection based on information theory, Universidad de Alicante, 2010. Available from: http://hdl.handle.net/10045/18362. Google Scholar

[14]

W. Chen, Z. Li and J. Guo, A VNS-EDA algorithm-based feature selection for credit risk classification, Mathematical Problems in Engineering, 2020 (2020), 14 pp. doi: 10.1155/2020/4515480.  Google Scholar

[15]

V. Cherkassky and Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, 17 (2004), 113-126.  doi: 10.1016/S0893-6080(03)00169-2.  Google Scholar

[16]

P.-H. Chou, M.-J. Wu and K.-K. Chen, Integrating support vector machine and genetic algorithm to implement dynamic wafer quality prediction system, Expert Systems with Applications, 37 (2010), 4413-4424.  doi: 10.1016/j.eswa.2009.11.087.  Google Scholar

[17]

C. Cortes and V. Vapnik, Support-Vector Networks, Machine Learning, 20 (1995), 273-297.  doi: 10.1007/BF00994018.  Google Scholar

[18]

H. Costa, L. R. Galvao, L. H. C. Merschmann and M. J. F. Souza, A VNS algorithm for feature selection in hierarchical classification context, Electronic Notes in Discrete Mathematics, 66 (2018), 79-86.  doi: 10.1016/j.endm.2018.03.011.  Google Scholar

[19]

N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, Cambridge University Press, 2000. Google Scholar

[20]

K. Fridgeirsdottir, R. Akella, L.-M. Al, Statistical methodology for yield enhancement via baseline reduction, Advanced Semiconductor Manufacturing Conference & Workshop, (1998), 77–81. doi: 10.1109/ASMC.1998.731402.  Google Scholar

[21]

J. Derrac, C. Cornelis, S. García and F. Herrera, Enhancing evolutionary instance selection algorithms by means of fuzzy rough set based feature selection, Information Sciences, 186 (2012), 73-92.  doi: 10.1016/j.ins.2011.09.027.  Google Scholar

[22]

S. Dong, Y. Zhang, Z. He, et al., Investigation of support vector machine and back propagation artificial neural network for performance prediction of the organic Rankine cycle system, Energy, 144 (2018), 851–864. doi: 10.1016/j.energy.2017.12.094.  Google Scholar

[23]

R. Dong, J. Xu and B. Lin, ROI-based study on impact factors of distributed PV projects by LSSVM-PSO, Energy, 124 (2017), 336-349.  doi: 10.1016/j.energy.2017.02.056.  Google Scholar

[24]

A. L. Ellefsen, E. Bjorlykhaug, V. Esoy, et al., Remaining useful life predictions for turbofan engine degradation using semi-supervised deep architecture, Reliability Engineering & System Safety, 183 (2019), 240-251.  Google Scholar

[25]

A. O. Filho, A. C. Silva, A. C. de Paiva, et al., Computer-aided diagnosis of lung nodules in computed tomography by using phylogenetic diversity, genetic algorithm, and SVM, Journal of Digital Imaging, 30 (2017), 812–822. doi: 10.1007/s10278-017-9973-6.  Google Scholar

[26]

B. Frénay, G. Doquire and M. Verleysen, Is mutual information adequate for feature selection in regression?, Neural Networks, 48 (2013), 1-7.  doi: 10.1016/j.neunet.2013.07.003.  Google Scholar

[27]

R. Fujimaki, T. Yairi and K. Machida, An anomaly detection method for spacecraft using relevance vector learning, Pacific-Asia Conference on Knowledge Discovery & Data Mining, Lecture Notes in Computer Science, 3518, Springer, Berlin, Heidelberg, 2005, 785–790. doi: 10.1007/11430919_92.  Google Scholar

[28]

M. Garcia-Torres, F. C. Garcia-López, B. Melián-Batista, J. A. Moreno-Pérez and J. M. Moreno-Vega, Solving feature subset selection problem by a hybrid, Hybrid Metaheuristics, (2004), 59–68. Google Scholar

[29]

M. Garcia-Torres, R. Armananzas, C. Bielza, et al., Comparison of metaheuristic strategies for peakbin selection in proteomic mass spectrometry data, Information Sciences, 222 (2013), 229–246. doi: 10.1016/j.ins.2010.12.013.  Google Scholar

[30]

M. García-Torres, F. Gómez-Vela, B. Melián-Batista and M. Moreno-Vega, High-dimensional feature selection via feature grouping: A variable neighborhood search approach, Information Sciences, 326 (2016), 102-118.  doi: 10.1016/j.ins.2015.07.041.  Google Scholar

[31]

J. Hua, W. D. Tembe and E. R. Dougherty, Performance of feature-selection methods in the classification of high-dimension data, Pattern Recognition, 42 (2009), 409-424.  doi: 10.1016/j.patcog.2008.08.001.  Google Scholar

[32]

C.-L. Huang and J.-F. Dun, A distributed PSO-SVM hybrid system with feature selection and parameter optimization, Applied Soft Computing, 8 (2008), 1381-1391.  doi: 10.1016/j.asoc.2007.10.007.  Google Scholar

[33]

P. Janik and T. Lobos, Automated classification of power-quality disturbances using SVM and RBF networks, IEEE Transactions on Power Delivery, 21 (2006), 1663-1669.  doi: 10.1109/TPWRD.2006.874114.  Google Scholar

[34]

A. C. Janssens, Y. Deng, G. J. Borsboom, et al., A new logistic regression approach for the evaluation of diagnostic test results, Medical Decision Making, 25 (2005), 168–177. doi: 10.1177/0272989X05275154.  Google Scholar

[35]

Y.-S. Jeong, B. Kim and Y.-D. Ko, Exponentially weighted moving average-based procedure with adaptive thresholding for monitoring nonlinear profiles: Monitoring of plasma etch process in semiconductor manufacturing, Expert Systems with Applications, 40 (2013), 5688-5693.  doi: 10.1016/j.eswa.2013.04.016.  Google Scholar

[36]

S. S. Keerthi and E. G. Gilbert, Convergence of a generalized SMO algorithm for SVM classifier design, Machine Learning, 46 (2002), 351-360.  doi: 10.1023/A:1012431217818.  Google Scholar

[37]

S. J. Kim, H. G. Yoon, K. B. Lee, et al., Hybrid overlay modeling for field-by-field error correction in the photolithography process, IEEE Transactions on Semiconductor Manufacturing, 33 (2020), 53–61. doi: 10.1109/TSM.2019.2957508.  Google Scholar

[38]

B. Kim, D. W. Kim and G. T. Park, Prediction of plasma etching using a polynomial neural network, IEEE Transactions on Plasma Science, 31 (2003), 1330-1336.  doi: 10.1109/TPS.2003.820681.  Google Scholar

[39]

T. Kolodziejczyk, R. Toscano, S. Fouvry and G. Morales-Espejel, Artificial intelligence as efficient technique for ball bearing fretting wear damage prediction, Wear, 268 (2010), 309-315.  doi: 10.1016/j.wear.2009.08.016.  Google Scholar

[40]

M. Kumar, M. Bhasin, N. K. Natt, et al., BhairPred: Prediction of β-hairpins in a protein from multiple alignment information using ANN and SVM techniques, Nucleic Acids Research, 33 (2005), 154–159. doi: 10.1093/nar/gki588.  Google Scholar

[41]

Y.-X. Lai, C.-F. Lai, Y.-M. Huang and H.-C. Chao, Multi-appliance recognition system with hybrid SVM/GMM classifier in ubiquitous smart home, Information Sciences, 230 (2013), 39-55.  doi: 10.1016/j.ins.2012.10.002.  Google Scholar

[42]

M. Last and A. Kandel, Data Mining for Process and Quality Control in the Semiconductor Industry, 3, Data Mining for Design and Manufacturing, Springer, Boston, MA, 2001. doi: 10.1007/978-1-4757-4911-3_9.  Google Scholar

[43]

H. Li, C.-J. Li, X.-J. Wu and J. Sun, Statistics-based wrapper for feature selection: An implementation on financial distress identification with support vector machine, Applied Soft Computing, 19 (2014), 57-67.  doi: 10.1016/j.asoc.2014.01.018.  Google Scholar

[44]

S. Li, H. Wu, D. Wan and J. Zhu, An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine, Knowledge-Based Systems, 24 (2011), 40-48.  doi: 10.1016/j.knosys.2010.07.003.  Google Scholar

[45]

Y. Li and X. Zhang, Diffusion maps based k-nearest-neighbor rule technique for semiconductor manufacturing process fault detection, Chemometrics and Intelligent Laboratory Systems, 136 (2014), 47-57.  doi: 10.1016/j.chemolab.2014.05.003.  Google Scholar

[46]

H. Liu and H. Motoda, Feature extraction, construction and selection: A data mining perspective, Journal of the American Statistical Association, 94 (1999), 014004. Google Scholar

[47]

J. Long, S. Zhang and C. Li, Evolving deep echo state networks for intelligent fault diagnosis, IEEE Transactions on Industrial Informatics, 16 (2020), 4928-4937.  doi: 10.1109/TII.2019.2938884.  Google Scholar

[48]

J. Long, Z. Sun, C. Li, Y. Hong, Y. Bai and S. Zhang, A novel sparse echo autoencoder network for data-driven fault diagnosis of delta 3-D printers, IEEE Transactions on Instrumentation and Measurement, 69 (2020), 683-692.  doi: 10.1109/TIM.2019.2905752.  Google Scholar

[49]

U. Maulik, A. Mukhopadhyay and D. Chakraborty, Gene-expression-based cancer subtypes prediction through feature selection and transductive SVM, IEEE Transactions on Biomedical Engineering, 60 (2013), 1111-1117.  doi: 10.1109/TBME.2012.2225622.  Google Scholar

[50]

M. Melhem, B. Ananou, M. Ouladsine, M. Combal and J. Pinaton, Product quality prediction using alarm data: Application to the semiconductor manufacturing process, 2017 25th Mediterranean Conference on Control and Automation (MED), (2017), 1332–1338. doi: 10.1109/MED.2017.7984303.  Google Scholar

[51]

L. Mönch, J. W. Fowler and S. J. Mason, Production Planning and Control for Semiconductor Wafer Fabrication Facilities: Modeling, Analysis, and Systems, 52, Springer Science & Business Media, Springer, New York, 2012. Google Scholar

[52]

A. Mucherino and L. Liberti, A VNS-based heuristic for feature selection in data mining, Hybrid Metaheuristics, Studies in Computational Intelligence, 434, Springer, Berlin, Heidelberg, 2013, 353–368. doi: 10.1007/978-3-642-30671-6_13.  Google Scholar

[53]

J. Neumann, C. Schnorr and G. Steidl, Combined SVM-based feature selection and classification, Machine Learning, 61 (2005), 129-150.  doi: 10.1007/s10994-005-1505-9.  Google Scholar

[54]

Y. Oh, K. Ransikarbum, M. Busogi, et al., Adaptive SVM-based real-time quality assessment for primer-sealer dispensing process of sunroof assembly line, Reliability Engineering & System Safety, 184 (2019), 202–212. doi: 10.1016/j.ress.2018.03.020.  Google Scholar

[55]

A. V. Phan, M. L. Nguyen and L. T. Bui, Feature weighting and SVM parameters optimization based on genetic algorithms for classification problems, Appl. Intell., 46 (2017), 455-469.  doi: 10.1007/s10489-016-0843-6.  Google Scholar

[56]

H. Purwins, et al., Regression methods for prediction of PECVD silicon nitride layer thickness, 2011 IEEE International Conference on Automation Science and Engineering, (2011), 387–392. doi: 10.1109/CASE.2011.6042426.  Google Scholar

[57]

J. R. Quinlan, Induction of decision trees, Machine Learning, 1 (1986), 81-106.  doi: 10.1007/BF00116251.  Google Scholar

[58]

M. S. Rahman, M. K. Rahman, M. Kaykobad and M. S. Rahman, isGPT: An optimized model to identify sub-Golgi protein types using SVM and random forest based feature selection, Artificial Intelligence in Medicine, 84 (2017), 90-100.  doi: 10.1016/j.artmed.2017.11.003.  Google Scholar

[59]

I. Rish, An empirical study of the naive Bayes classifier, IJCAI 2001 Workshop on Empirical Methods in Artificial Intelligence, 3 (2001), 41-46.   Google Scholar

[60]

Y.-C. Su, M.-H. Hung, F.-T. Cheng and Y.-T. Chen, A processing quality prognostics scheme for plasma sputtering in TFT-LCD manufacturing, IEEE Transactions on Semiconductor Manufacturing, 19 (2006), 183-194.  doi: 10.1109/TSM.2006.873514.  Google Scholar

[61]

J. B. Tenenbaum, V. de Silva and J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science, 290 (2000), 2319-2323.  doi: 10.1126/science.290.5500.2319.  Google Scholar

[62]

A. Unler, A. Murat and R. B. Chinnam, mr2PSO: A maximum relevance minimum redundancy feature selection method based on swarm intelligence for support vector machine classification, Information Sciences, 181 (2011), 4625-4641.  doi: 10.1016/j.ins.2010.05.037.  Google Scholar

[63]

Y. Wang, Z. Xue, G. Shen, et al., PRINTR: Prediction of RNA binding sites in proteins using SVM and profiles, Amino Acids, 35 (2008), 295–302. doi: 10.1007/s00726-007-0634-9.  Google Scholar

[64]

R. Weber and J. Basak, Simultaneous feature selection and classification using kernel-penalized support vector machines, Information Sciences, 181 (2011), 115-128.  doi: 10.1016/j.ins.2010.08.047.  Google Scholar

[65]

C. Wei, J. Chen, Z. Song and C. Chen, Soft sensors of nonlinear industrial processes based on self-learning kernel regression model, Asian Control Conference, IEEE, (2018), 1783–1788. doi: 10.1109/ASCC.2017.8287444.  Google Scholar

[66]

X. Wu, L. Chen, S. Pang and X. Ding, A paratactic subjective-objective weighting methods and SVM risk assessment model applied in textile and apparel safety, International Journal of Quality & Reliability Management, 32 (2015), 472-485.  doi: 10.1108/IJQRM-06-2013-0102.  Google Scholar

[67]

Y. Xiang and L. Jiang, Water quality prediction using LS-SVM and particle swarm optimization, 2009 Second International Workshop on Knowledge Discovery and Data Mining, (2009), 900–904. doi: 10.1109/WKDD.2009.217.  Google Scholar

[68]

S. D. Xin and C. B. Zhong, Some practical application of sequential analysis to the fault prediction system of a main diesel engine, Conference of the IEEE Industrial Electronics Society, IEEE, 3 (2002), 2151-2156.  doi: 10.1109/IECON.1991.239009.  Google Scholar

[69]

H. Zhang, Q. Li, Z. Sun and Y. Liu, Combining data-driven and model-driven methods for robust facial landmark detection, IEEE Transactions on Information Forensics and Security, (2016), 2409–2422. doi: 10.1109/TIFS.2018.2800901.  Google Scholar

[70]

H. Zhang, Q.-Y. Chen, M-L. Xiang, et al., In silico prediction of mitochondrial toxicity by using GA-CG-SVM approach, Toxicology in Vitro, 23 (2009), 134–140. doi: 10.1016/j.tiv.2008.09.017.  Google Scholar

[71]

C. Zheng and L. Jiao, Automatic parameters selection for SVM based on GA, Intelligent Control & Automation, 2 (2004), 1869-1872.  doi: 10.1109/WCICA.2004.1341000.  Google Scholar

[72]

X. Zhao, E. K. Wong, Y. Wang, et al., A support vector machine (SVM) for predicting preferred treatment position in radiotherapy of patients with breast cancer, Medical Physics, 37 (2010), 5341–5350. doi: 10.1118/1.3483264.  Google Scholar

[73]

Y. Zhu, Y. Tan, Y. Hua, et al., Feature selection and performance evaluation of support vector machine (SVM)-based classifier for differentiating benign and malignant pulmonary nodules by computed tomography, Journal of Digital Imaging, 23 (2010), 51–65. doi: 10.1007/s10278-009-9185-9.  Google Scholar

Figure 1.  Stages of semiconductor manufacturing
Figure 2.  Flowchart of proposed method
Figure 3.  Flowchart of DA-VNS algorithm
Figure 4.  Encoding of DA-VNS algorithm
Figure 5.  Running results after 500 iterations of GA, DA and DA-VNS algorithms respectively (The blue points indicate the projection of the solutions on $ (q(\theta),R(\theta)) $.)
Figure 6.  The evolution of the best $ q(\theta) $ and $ R(\theta) $ for GA, DA and DA-VNS respectively over 500 iterations
Figure 7.  Performance improvement between DA-VNS-SVM and other algorithms
Table 1.  Quality prediction problems in semiconductor manufacturing processes in recent years
Publications Problems Methods Data Driven
Fridgeirsdottir [73] Fault diagnosis Data mining
Kim [37] Prediction of plasma etch processes PNN
Bae [9] Modeling and rule extraction of the ingot fabrication DPNN
Su [59] Quality prognostics for plasma sputtering NN
Chou [16] Prediction of dynamic wafer quality SVM
Purwins [55] Prediction of Silicon Nitride layer thickness Collinearity regression
Melhem [49] Prediction of batch scrap Regularized regression
Alagic [5] Prediction of the damage intensity Image processing and statistical modeling
Al-Kharaz [4] Prediction of quality state ANN
Kim [36] Prediction of wafers errors Ordinary least squares regression and ridge regression
Table 2.  Common Kernel Function
Kernel function name Kernel function representation
Radial basis function $ \kappa(x_i,x_j)=\exp(-\gamma ||x_i-x_j||) $
Linear kernel function $ \kappa(x_i,x_j)=x_i\cdot x_j $
Polynomial kernel function $ \kappa(x_i,x_j)=(x_i\cdot x_j+1)^d $
Sigmoid kernel function $ \kappa(x_i,x_j)=\tanh(\eta\langle x_i,x_j\rangle+\theta) $
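For reference, the kernels listed in Table 2 can be transcribed directly into NumPy as sketched below. The symbol names ($ \gamma $, $ d $, $ \eta $, $ \theta $) follow the table, and the RBF is written exactly as the table states it, with $ ||x_i-x_j|| $ rather than the more common squared norm; the paper's improved kernels $ \kappa_1 $ and $ \kappa_2 $ are not reproduced here because their definitions do not appear on this page.

```python
# Plain NumPy transcription of the kernel functions in Table 2 (illustrative).
import numpy as np

def rbf_kernel(xi, xj, gamma):
    # exp(-gamma * ||xi - xj||), as written in Table 2
    return np.exp(-gamma * np.linalg.norm(xi - xj))

def linear_kernel(xi, xj):
    return float(np.dot(xi, xj))

def polynomial_kernel(xi, xj, d):
    return (np.dot(xi, xj) + 1.0) ** d

def sigmoid_kernel(xi, xj, eta, theta):
    return np.tanh(eta * np.dot(xi, xj) + theta)
```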
Table 3.  List of preset parameters in DA-VNS
Parameters $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $ $ DA-VNS_{\kappa_{2}} $
Population size 100 100 100
Iteration times 500 500 500
Nearest neighbor number / 5 5
Learning probability 0.8 0.8 0.8
Innovation probability 0.1 0.1 0.1
Mutation probability 0.1 0.1 0.1
Search range of penalty parameter $ C $ $ [10^{-3},10^{3}] $ $ [10^{-3},10^3] $ $ [10^{-3},10^3] $
Search range of kernel width $ \gamma $ $ [2^{-6},2^6] $ $ [2^{-6},2^6] $ $ [2^{-6},2^6] $
Search range of amplitude regulating parameter $ t_1 $ / / $ [-10,10] $
Search range of displacement regulating parameter $ t_2 $ / / $ [-10,10] $
Luck coefficient {0, 0.01, 0.1, 0.2, 0.5} {0, 0.01, 0.1, 0.2, 0.5} {0, 0.01, 0.1, 0.2, 0.5}
$ w_c $, $ w_f $, $ c_{f1} $, $ c_{f2} $ 0.8, 0.2, 0.8, 0.2 0.8, 0.2, 0.8, 0.2 0.8, 0.2, 0.8, 0.2
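Read as configuration, the Table 3 settings for the $ \kappa_2 $ variant can be collected into a single parameter dictionary, as sketched below; the key names are ours, while all values come from the table.

```python
# DA-VNS (kappa_2) preset parameters from Table 3, restated as a Python dict.
# Key names are illustrative; values are taken directly from the table.
da_vns_kappa2_config = {
    "population_size": 100,
    "iterations": 500,
    "nearest_neighbors": 5,
    "learning_probability": 0.8,
    "innovation_probability": 0.1,
    "mutation_probability": 0.1,
    "C_range": (1e-3, 1e3),        # penalty parameter C
    "gamma_range": (2**-6, 2**6),  # kernel width gamma
    "t1_range": (-10, 10),         # amplitude regulating parameter
    "t2_range": (-10, 10),         # displacement regulating parameter
    "luck_coefficients": [0, 0.01, 0.1, 0.2, 0.5],
    "w_c": 0.8, "w_f": 0.2, "c_f1": 0.8, "c_f2": 0.2,
}
```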
Table 4.  Comparison of performance between DA-VNS and other algorithms
Algorithm Optimal parameters $ q(\theta) $ $ R(\theta) $ Selected features
$ C $ $ \gamma $ $ t_1 $ $ t_2 $
$ GA_{RBF} $ 78.54 3.36 / / 0.6136 0.725 60
$ GA_{\kappa_1} $ 73.11 0.80 / / 0.6217 0.7333 47
$ GA_{\kappa_2} $ 11.21 0.94 1.53 4.41 0.6262 0.7416 56
$ DA_{RBF} $ 96.10 0.52 / / 0.6329 0.75 64
$ DA_{\kappa_{1}} $ 62.34 0.49 / / 0.6559 0.775 43
$ DA_{\kappa_{2}} $ 42.04 0.49 9.81 9.96 0.6695 0.7917 41
IG / / / / 0.6957 0.8105 24
$ DA-VNS_{RBF} $ 21.32 1.31 / / 0.6882 0.8033 48
$ DA-VNS_{\kappa_{1}} $ 60.82 1.60 / / 0.7038 0.8333 36
$ DA-VNS_{\kappa_{2}} $ 96 0.91 -2.69 8.77 0.7221 0.8583 49
Table 5.  Performance improvement of $ q(\theta) $ between DA-VNS and other algorithms. $ Improvement = \frac{q(\theta)-q^{\prime}(\theta)}{q(\theta)}*100\% $
Improvement $ GA_{RBF} $ $ GA_{\kappa_{1}} $ $ GA_{\kappa_{2}} $ $ DA_{RBF} $ $ DA_{\kappa_{1}} $ $ DA_{\kappa_{2}} $ IG $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $
$ DA-VNS_{\kappa_{1}} $ 12.82 11.66 11.02 10.07 6.81 4.87 1.15 2.22 /
$ DA-VNS_{\kappa_{2}} $ 15.03 13.90 13.28 12.35 9.17 7.28 3.66 4.69 2.53
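As a quick check of the improvement formula in the caption of Table 5, the entry comparing DA-VNS$ _{\kappa_{2}} $ against GA$ _{RBF} $ follows from the $ q(\theta) $ values in Table 4 (0.7221 and 0.6136):

$ Improvement = \frac{0.7221-0.6136}{0.7221}\times 100\% \approx 15.03\% $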
Table 6.  Performance improvement of $ R(\theta) $ between DA-VNS and other algorithms. $ Improvement = \frac{R(\theta)-R^{\prime}(\theta)}{R(\theta)}*100\% $
Improvement $ GA_{RBF} $ $ GA_{\kappa_{1}} $ $ GA_{\kappa_{2}} $ $ DA_{RBF} $ $ DA_{\kappa_{1}} $ $ DA_{\kappa_{2}} $ IG $ DA-VNS_{RBF} $ $ DA-VNS_{\kappa_{1}} $
$ DA-VNS_{\kappa_{1}} $ 13.00 12.00 11.00 10.00 7.00 4.99 2.74 3.60 /
$ DA-VNS_{\kappa_{2}} $ 15.53 14.56 13.60 12.62 9.70 7.76 5.57 6.41 2.91
Table 7.  Comparison of performance between DA-VNS-SVM and common machine learning algorithms
Algorithm Accuracy
Logistic Regression (LR) 0.4917
Naive Bayes (NB) 0.6167
Artificial Neural Network (ANN) 0.6417
Decision Tree (DT) 0.658
Random Forest (RF) 0.667
DA-VNS$ _{\kappa_1} $-SVM 0.7038
DA-VNS$ _{\kappa_2} $-SVM 0.7221