American Institute of Mathematical Sciences

doi: 10.3934/naco.2021027
Online First



Smooth augmented Lagrangian method for twin bounded support vector machine

1. Department of Applied Mathematics, Faculty of Mathematical Sciences, University of Guilan, Rasht, Iran
2. Department of Mathematics, Faculty of Science, University of Bojnord, Bojnord, Iran

* Corresponding author: Saeed Ketabchi

Received: September 2020; Revised: June 2021; Early access: July 2021

In this paper, we propose a method for solving the twin bounded support vector machine (TBSVM) for binary classification. We combine the augmented Lagrangian (AL) optimization method with a smoothing technique to obtain new unconstrained smooth minimization problems for the TBSVM classifiers. First, the augmented Lagrangian method is employed to convert the TBSVM into unconstrained minimization problems, called AL-TBSVM. We then solve the primal problems of AL-TBSVM by converting them into smooth unconstrained minimization problems. The smooth reformulations of AL-TBSVM, which we call AL-STBSVM, are solved by the well-known Newton method. Finally, experimental results on artificial data and several University of California Irvine (UCI) benchmark data sets, together with a statistical analysis, demonstrate the superior performance of our method in terms of classification accuracy and learning speed.
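The core numerical ingredient is replacing the nonsmooth plus function $(x)_{+} = \max(x, 0)$, which appears in hinge-type penalty terms, with a twice-differentiable approximation so that Newton's method becomes applicable. The sketch below is a minimal, hypothetical illustration of that idea on a plain squared-hinge SVM objective — not the paper's AL-TBSVM formulation — using the entropy-type smoothing $p(x,\alpha) = x + \frac{1}{\alpha}\log(1 + e^{-\alpha x})$ in the spirit of smooth SVM approaches; all function names and parameter values are illustrative.

```python
import numpy as np

def smooth_plus(x, a):
    """Smooth approximation of the plus function (x)_+ = max(x, 0):
    p(x, a) = x + (1/a) * log(1 + exp(-a*x)); p -> (x)_+ as a -> infinity."""
    return x + np.log1p(np.exp(-a * x)) / a

def smooth_plus_d1(x, a):
    """First derivative: p'(x) = sigmoid(a*x)."""
    return 1.0 / (1.0 + np.exp(-a * x))

def smooth_plus_d2(x, a):
    """Second derivative: p''(x) = a * s * (1 - s), with s = sigmoid(a*x)."""
    s = smooth_plus_d1(x, a)
    return a * s * (1.0 - s)

def newton_smooth_svm(X, y, c=1.0, a=5.0, iters=30, tol=1e-8):
    """Newton's method on the smoothed squared-hinge objective
        f(w) = 0.5*||w||^2 + c * sum_i p(1 - y_i * x_i^T w, a)^2,
    which is smooth and strongly convex, so Newton steps converge quickly."""
    n = X.shape[1]
    w = np.zeros(n)
    for _ in range(iters):
        z = 1.0 - y * (X @ w)
        p = smooth_plus(z, a)
        d1 = smooth_plus_d1(z, a)
        d2 = smooth_plus_d2(z, a)
        grad = w - 2.0 * c * (X.T @ (y * p * d1))
        # Hessian: I + 2c * sum_i (p'^2 + p*p'') x_i x_i^T   (y_i^2 = 1)
        H = np.eye(n) + 2.0 * c * (X.T * (d1**2 + p * d2)) @ X
        w = w - np.linalg.solve(H, grad)
        if np.linalg.norm(grad) < tol:
            break
    return w
```

Since the smoothed objective is strongly convex with a positive-definite Hessian, each Newton system is well posed and no line search is needed for this toy problem; the actual AL-STBSVM procedure applies the same smoothing to the augmented Lagrangian reformulations of the two TBSVM subproblems.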

Citation: Fatemeh Bazikar, Saeed Ketabchi, Hossein Moosaei. Smooth augmented Lagrangian method for twin bounded support vector machine. Numerical Algebra, Control and Optimization, doi: 10.3934/naco.2021027


Illustration of TWSVM
Results of linear TWSVM, TBSVM, I$\nu$-TBSVM and AL-STBSVM on generated data set
Descriptions of the data sets from the UCI repository
 Data set        # Cases  # Features  # Classes  Source
 Sonar             208       60           2       UCI
 Cancer            699        9           2       UCI
 Diabet            768        8           2       UCI
 Wdbc              569       30           2       UCI
 Ionosphere        351       34           2       UCI
 Australian        690       14           2       UCI
 Heart             270       14           2       UCI
 Haberman          306        3           2       UCI
 German           1000       24           2       UCI
 House Votes       435       16           2       UCI
 Spect             237       22           2       UCI
 Splice           1000       60           2       UCI
 Lung-cancer        32       56           2       UCI
 F-diagnosis       100        9           2       UCI
 Breast-cancer     116        9           2       UCI
 Bupa              345        6           2       UCI
 Pima              768        9           2       UCI
 Housing           506       14           2       UCI
Comparison of linear TWSVM, TBSVM, I$\nu$-TBSVM and proposed model (AL-STBSVM) on UCI benchmark data sets
 Entries are Acc(%), Time(s), with tuned parameters in brackets: TWSVM [$c_{1}=c_{2}$]; TBSVM [$c_{3}=c_{4}$, $c_{1}=c_{2}$]; I$\nu$-TBSVM [$c_{1}=c_{2}$, $\nu$]; AL-STBSVM [$c_{3}=c_{4}$, $c_{1}=c_{2}$].
 Sonar (208$\times$60): TWSVM 76.88, 0.73 [$2^{6}$]; TBSVM 77.45, 1.53 [$2^{6}$, $2^{-3}$]; I$\nu$-TBSVM 70.29, 1.53 [$2^{6}$, 0.2]; AL-STBSVM 79.14, 0.52 [1, $2^{-8}$]
 Cancer (699$\times$9): TWSVM 96.13, 1.63 [$2^{-2}$]; TBSVM 96.28, 2.63 [$2^{-5}$, $2^{-3}$]; I$\nu$-TBSVM 90.13, 2.28 [1, 0.1]; AL-STBSVM 96.15, 1.99 [$2^{-5}$, $2^{-8}$]
 Diabet (768$\times$8): TWSVM 69.13, 1.90 [$2^{3}$]; TBSVM 73.58, 2.68 [$2^{5}$, $2^{-4}$]; I$\nu$-TBSVM 62.24, 3.39 [$2^{4}$, 0.5]; AL-STBSVM 71.74, 27.35 [$2^{3}$, $2^{-7}$]
 Wdbc (569$\times$30): TWSVM 93.66, 1.6 [$2^{6}$]; TBSVM 94.74, 2.37 [$2^{3}$, $2^{-3}$]; I$\nu$-TBSVM 90.87, 2 [$2^{5}$, 0.8]; AL-STBSVM 95.62, 6.16 [$2^{-5}$, $2^{-4}$]
 Ionosphere (351$\times$34): TWSVM 84.89, 0.86 [$2^{-3}$]; TBSVM 86.30, 1.71 [$2^{-2}$, $2^{-2}$]; I$\nu$-TBSVM 84.03, 1.67 [$2^{-2}$, 0.7]; AL-STBSVM 85.21, 0.62 [$2^{3}$, $2^{-7}$]
 Australian (690$\times$14): TWSVM 83.05, 1.62 [$2^{5}$]; TBSVM 84.05, 2.68 [$2^{5}$, 1]; I$\nu$-TBSVM 67.66, 3.30 [$2^{3}$, 0.6]; AL-STBSVM 83.90, 2.93 [2, $2^{4}$]
 Heart (270$\times$14): TWSVM 84.44, 0.98 [$2^{2}$]; TBSVM 84.44, 1.57 [2, $2^{3}$]; I$\nu$-TBSVM 83.70, 1.64 [$2^{2}$, 0.9]; AL-STBSVM 85.19, 0.63 [$2^{5}$, $2^{-7}$]
 Haberman (306$\times$3): TWSVM 74.51, 0.86 [$2^{-2}$]; TBSVM 75.91, 1.64 [$2^{3}$, $2^{2}$]; I$\nu$-TBSVM 72.91, 1.66 [1, 0.6]; AL-STBSVM 77.05, 0.55 [$2^{4}$, $2^{4}$]
 German (1000$\times$24): TWSVM 73.9, 1.78 [$2^{-2}$]; TBSVM 75.4, 3.94 [$2^{-5}$, $2^{-3}$]; I$\nu$-TBSVM 61.8, 6.59 [$2^{-3}$, 0.4]; AL-STBSVM 75.1, 11.42 [$2^{10}$, $2^{2}$]
 House Votes (435$\times$16): TWSVM 95.62, 0.85 [$2^{8}$]; TBSVM 95.85, 1.90 [2, $2^{-3}$]; I$\nu$-TBSVM 93.11, 1.67 [1, 0.1]; AL-STBSVM 96.08, 0.65 [$2^{2}$, $2^{2}$]
 Spect (237$\times$22): TWSVM 68.60, 0.75 [$2^{-5}$]; TBSVM 70.44, 1.59 [$2^{-5}$, $2^{-6}$]; I$\nu$-TBSVM 73.57, 1.6 [$2^{9}$, 0.9]; AL-STBSVM 71.24, 0.55 [$2^{3}$, $2^{4}$]
 Splice (1000$\times$60): TWSVM 75.40, 2.44 [$2^{-5}$]; TBSVM 79.70, 3.77 [$2^{-5}$, $2^{-6}$]; I$\nu$-TBSVM 78.20, 5.63 [$2^{-5}$, 0.7]; AL-STBSVM 79.18, 10.19 [$2^{5}$, $2^{4}$]
 Lung-cancer (32$\times$56): TWSVM 82.5, 0.62 [$2^{-1}$]; TBSVM 82.5, 1.40 [$2^{-2}$, $2^{-3}$]; I$\nu$-TBSVM 85, 1.42 [$2^{-3}$, 0.5]; AL-STBSVM 85.83, 0.42 [$2^{6}$, $2^{-8}$]
 F-diagnosis (100$\times$9): TWSVM 76.49, 0.62 [$2^{5}$]; TBSVM 82.34, 1.40 [$2^{8}$, $2^{5}$]; I$\nu$-TBSVM 68.35, 1.48 [$2^{10}$, 0.2]; AL-STBSVM 75.01, 0.49 [$2^{4}$, $2^{-7}$]
 Breast-cancer (116$\times$9): TWSVM 72.52, 0.61 [$2^{6}$]; TBSVM 71.84, 1.38 [$2^{3}$, $2^{-2}$]; I$\nu$-TBSVM 57.04, 1.52 [$2^{6}$, 0.9]; AL-STBSVM 73.81, 0.45 [$2^{5}$, $2^{-7}$]
 Bupa (345$\times$6): TWSVM 64.35, 0.73 [$2^{-4}$]; TBSVM 65.23, 1.69 [$2^{-3}$, 2]; I$\nu$-TBSVM 69.29, 1.72 [$2^{-4}$, 0.2]; AL-STBSVM 66.68, 0.58 [$2^{2}$, $2^{4}$]
 Pima (768$\times$9): TWSVM 71.36, 0.86 [$2^{-7}$]; TBSVM 73.30, 1.64 [$2^{-5}$, $2^{-3}$]; I$\nu$-TBSVM 64.72, 2.65 [$2^{-7}$, 0.2]; AL-STBSVM 73.62, 2.04 [$2^{2}$, $2^{4}$]
 Housing (506$\times$14): TWSVM 80.08, 0.96 [$2^{-8}$]; TBSVM 80.24, 2.89 [$2^{-7}$, $2^{-6}$]; I$\nu$-TBSVM 61.11, 2.01 [$2^{-7}$, 0.2]; AL-STBSVM 93.09, 1.5 [$2^{6}$, $2^{-8}$]
 Average accuracy: TWSVM 79.08; TBSVM 80.53; I$\nu$-TBSVM 74.11; AL-STBSVM 83.31
Comparison of nonlinear TWSVM, TBSVM, I$\nu$-TBSVM and proposed model (AL-STBSVM) on UCI benchmark data sets
 Entries are Acc(%), Time(s), with tuned parameters in brackets: TWSVM [$c_{1}=c_{2}$, $\gamma$]; TBSVM [$c_{3}=c_{4}$, $c_{1}=c_{2}$, $\gamma$]; I$\nu$-TBSVM [$c_{1}=c_{2}$, $\nu$, $\gamma$]; AL-STBSVM [$c_{3}=c_{4}$, $c_{1}=c_{2}$, $\gamma$].
 Sonar (208$\times$60): TWSVM 84.53, 0.80 [$2^{5}$, $2^{-6}$]; TBSVM 86.14, 1.62 [$2^{6}$, $2^{-3}$, $2^{-3}$]; I$\nu$-TBSVM 83.27, 1.67 [$2^{5}$, 0.2, $2^{-3}$]; AL-STBSVM 87.54, 0.84 [$2^{2}$, $2^{-8}$, $2^{-6}$]
 Cancer (699$\times$9): TWSVM 96.42, 4.71 [$2^{-2}$, $2^{-6}$]; TBSVM 96.42, 6.59 [$2^{-3}$, $2^{-3}$, $2^{-6}$]; I$\nu$-TBSVM 96.02, 2.90 [2, 0.1, $2^{-4}$]; AL-STBSVM 96.85, 3.85 [$2^{-5}$, $2^{-8}$, $2^{-3}$]
 Diabet (768$\times$8): TWSVM 65.49, 10.18 [$2^{5}$, $2^{-6}$]; TBSVM 66.28, 8.21 [$2^{5}$, $2^{-4}$, $2^{-6}$]; I$\nu$-TBSVM 65.11, 5.33 [$2^{4}$, 0.4, $2^{-3}$]; AL-STBSVM 69.14, 21.28 [$2^{5}$, $2^{-6}$, $2^{-3}$]
 Wdbc (569$\times$30): TWSVM 62.74, 2.64 [$2^{5}$, $2^{-6}$]; TBSVM 62.74, 3.53 [$2^{4}$, $2^{-3}$, $2^{-6}$]; I$\nu$-TBSVM 91.76, 2.17 [$2^{-9}$, 0.3, $2^{-8}$]; AL-STBSVM 87.18, 10.69 [$2^{-5}$, $2^{-6}$, $2^{-1}$]
 Ionosphere (351$\times$34): TWSVM 94.89, 1.42 [$2^{3}$, $2^{-6}$]; TBSVM 94.54, 2.15 [$2^{-4}$, $2^{-2}$, $2^{-3}$]; I$\nu$-TBSVM 87.17, 1.88 [$2^{-2}$, 0.2, $2^{-2}$]; AL-STBSVM 94.89, 2.96 [$2^{4}$, $2^{-6}$, $2^{-6}$]
 Australian (690$\times$14): TWSVM 55.65, 4.89 [$2^{3}$, $2^{-3}$]; TBSVM 55.22, 5.47 [$2^{5}$, 2, $2^{-6}$]; I$\nu$-TBSVM 55.51, 6.94 [$2^{3}$, 0.5, $2^{-3}$]; AL-STBSVM 65.21, 17.95 [$2^{2}$, $2^{4}$, $2^{-5}$]
 Heart (270$\times$14): TWSVM 82.59, 0.92 [$2^{2}$, $2^{-3}$]; TBSVM 83.33, 1.86 [2, 2, $2^{-6}$]; I$\nu$-TBSVM 81.48, 1.76 [$2^{2}$, 0.9, $2^{9}$]; AL-STBSVM 83.33, 1.58 [$2^{5}$, $2^{-7}$, $2^{-6}$]
 Haberman (306$\times$3): TWSVM 73.85, 1.28 [$2^{-1}$, $2^{-6}$]; TBSVM 73.52, 2.01 [2, 2, $2^{-6}$]; I$\nu$-TBSVM 73.19, 1.88 [1, 0.6, $2^{9}$]; AL-STBSVM 73.52, 2.67 [$2^{3}$, $2^{4}$, $2^{-6}$]
 German (1000$\times$24): TWSVM 70.1, 14.69 [$2^{-2}$, $2^{-6}$]; TBSVM 70.2, 14.34 [$2^{-4}$, $2^{-4}$, $2^{-6}$]; I$\nu$-TBSVM 70, 7.33 [$2^{-3}$, 0.4, $2^{7}$]; AL-STBSVM 71.5, 35.33 [$2^{9}$, $2^{4}$, $2^{-6}$]
 House Votes (435$\times$16): TWSVM 92.64, 1.22 [$2^{8}$, $2^{-6}$]; TBSVM 93.55, 2.20 [2, $2^{-2}$, $2^{-6}$]; I$\nu$-TBSVM 91.70, 1.78 [1, 0.1, $2^{-5}$]; AL-STBSVM 94.71, 4.12 [$2^{3}$, $2^{4}$, $2^{-3}$]
 Spect (237$\times$22): TWSVM 71.89, 0.86 [$2^{-5}$, $2^{-6}$]; TBSVM 73.76, 1.66 [$2^{-3}$, $2^{-6}$, $2^{-4}$]; I$\nu$-TBSVM 68.90, 1.66 [$2^{-7}$, 0.6, $2^{-5}$]; AL-STBSVM 74.17, 1.45 [$2^{3}$, $2^{4}$, $2^{-1}$]
 Splice (1000$\times$60): TWSVM 76.71, 15.98 [$2^{-6}$, $2^{-6}$]; TBSVM 75.20, 16.58 [$2^{-5}$, $2^{-5}$, $2^{-6}$]; I$\nu$-TBSVM 74.41, 14.66 [$2^{-4}$, 0.2, $2^{-3}$]; AL-STBSVM 87.20, 42.10 [$2^{5}$, $2^{5}$, $2^{-6}$]
 Lung-cancer (32$\times$56): TWSVM 80.83, 0.67 [2, $2^{-6}$]; TBSVM 85.83, 1.41 [$2^{-2}$, $2^{-3}$, $2^{-6}$]; I$\nu$-TBSVM 84.16, 1.51 [$2^{-5}$, 0.1, $2^{-3}$]; AL-STBSVM 85, 0.35 [$2^{5}$, $2^{-8}$, $2^{-6}$]
 F-diagnosis (100$\times$9): TWSVM 88.14, 0.64 [$2^{5}$, $2^{-3}$]; TBSVM 88.14, 1.42 [$2^{6}$, $2^{5}$, $2^{-3}$]; I$\nu$-TBSVM 87.16, 1.59 [$2^{10}$, 0.2, $2^{7}$]; AL-STBSVM 89.25, 0.44 [$2^{4}$, $2^{-7}$, $2^{-2}$]
 Breast-cancer (116$\times$9): TWSVM 55.16, 0.65 [$2^{5}$, $2^{-3}$]; TBSVM 55.16, 1.48 [$2^{3}$, $2^{-5}$, $2^{-3}$]; I$\nu$-TBSVM 55.15, 1.56 [$2^{7}$, 0.9, $2^{-6}$]; AL-STBSVM 60.31, 0.48 [$2^{4}$, $2^{-7}$, $2^{-6}$]
 Bupa (345$\times$6): TWSVM 64.37, 1.19 [$2^{-4}$, $2^{-6}$]; TBSVM 64.35, 2.02 [$2^{3}$, 2, $2^{-3}$]; I$\nu$-TBSVM 65.80, 1.94 [$2^{-5}$, 0.2, $2^{-3}$]; AL-STBSVM 66.36, 2.62 [$2^{3}$, $2^{4}$, $2^{-6}$]
 Pima (768$\times$9): TWSVM 65.63, 9.66 [$2^{-3}$, $2^{-6}$]; TBSVM 66.15, 8.08 [$2^{-6}$, $2^{-6}$, $2^{-6}$]; I$\nu$-TBSVM 65.12, 4.74 [$2^{-3}$, 0.6, $2^{-6}$]; AL-STBSVM 69, 20.07 [$2^{2}$, $2^{5}$, $2^{-3}$]
 Housing (506$\times$14): TWSVM 93.09, 2.12 [$2^{-6}$, $2^{6}$]; TBSVM 93.09, 3.65 [$2^{-5}$, $2^{-3}$, $2^{7}$]; I$\nu$-TBSVM 93.09, 4.30 [$2^{10}$, 0.7, $2^{7}$]; AL-STBSVM 93.09, 7.01 [$2^{7}$, $2^{-8}$, $2^{-6}$]
 Average accuracy: TWSVM 76.37; TBSVM 76.86; I$\nu$-TBSVM 77.16; AL-STBSVM 80.46
Accuracy ranks of linear classifiers on UCI benchmark data sets
 Data set        TWSVM  TBSVM  I$\nu$-TBSVM  AL-STBSVM
 Sonar             3      2        4            1
 Cancer            3      1        4            2
 Diabet            3      1        4            2
 Wdbc              3      2        4            1
 Ionosphere        3      1        4            2
 Australian        3      1        4            2
 Heart             2.5    2.5      4            1
 Haberman          3      2        4            1
 German            3      1        4            2
 House Votes       3      2        4            1
 Spect             4      3        1            2
 Splice            4      1        3            2
 Lung-cancer       3.5    3.5      2            1
 F-diagnosis       2      1        4            3
 Breast-cancer     2      3        4            1
 Bupa              4      3        1            2
 Pima              3      2        4            1
 Housing           3      2        4            1
 Average rank      3.06   1.89     3.5          1.56
Accuracy ranks of nonlinear classifiers on UCI benchmark data sets
 Data set        TWSVM  TBSVM  I$\nu$-TBSVM  AL-STBSVM
 Sonar             3      2        4            1
 Cancer            2.5    2.5      4            1
 Diabet            3      2        4            1
 Wdbc              3.5    3.5      1            2
 Ionosphere        1.5    3        4            1.5
 Australian        2      4        3            1
 Heart             3      1.5      4            1.5
 Haberman          1      2.5      4            2.5
 German            3      2        4            1
 House Votes       3      2        4            1
 Spect             3      2        4            1
 Splice            2      3        4            1
 Lung-cancer       4      1        3            2
 F-diagnosis       2.5    2.5      4            1
 Breast-cancer     2.5    2.5      4            1
 Bupa              3      4        2            1
 Pima              2      3        4            1
 Housing           2.5    2.5      2.5          2.5
 Average rank      2.61   2.53     3.53         1.33
