# American Institute of Mathematical Sciences

April  2017, 14(2): 511-527. doi: 10.3934/mbe.2017031

## Machine learning of swimming data via wisdom of crowd and regression analysis

1. School of Computer Engineering and Science, Shanghai University, 99 Shangda Road, Shanghai 200444, China
2. University High School, 4771 Campus Drive, Irvine, CA 92612, USA
3. Department of Mathematics, Center for Mathematical and Computational Biology, University of California, Irvine, CA 92697, USA

* Corresponding author: qnie@math.uci.edu

Received: July 2016. Accepted: August 05, 2016. Published: October 2016.

Fund Project: This work was partially supported by the Major Research Plan of NSFC [No. 91330116], the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, and National Science Foundation grants DMS1161621 and DMS1562176.

Every performance by a registered USA swimmer in an officially sanctioned meet is recorded in an online database, with times dating back to 1980. For the first time, statistical analysis and machine learning methods are systematically applied to 4,022,631 swim records. In this study, we investigate performance features for all strokes as a function of age and gender. We study the variance in performance of males and females at different ages and for different strokes, and we estimate the correlations of performances across ages using the Pearson correlation coefficient. Regression analysis shows the performance trends for both males and females at different ages and suggests critical ages for peak training. Moreover, we assess twelve popular machine learning methods for predicting or classifying swimmer performance. Each method exhibits different strengths and weaknesses in different cases, indicating that no single method predicts well for all strokes. To address this problem, we propose a new method, the Wisdom of Crowd Classifier (WoCC), which combines multiple inference methods. Our simulation experiments demonstrate that the WoCC is consistent and has better overall prediction accuracy. Our study reveals several new age-dependent trends in swimming and provides an accurate method for classifying and predicting swimming times.
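The WoCC aggregates the votes of multiple base classifiers. The paper's exact combination rule is not reproduced on this page, so the following is only a minimal majority-vote sketch; the three toy lambda "classifiers", which map a 100M freestyle time to a hypothetical time-standard class, stand in for trained models.

```python
from collections import Counter

def wocc_predict(classifiers, x):
    """Wisdom-of-Crowd classification sketch: every base classifier votes
    for a class label and the majority label wins."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical base classifiers: each maps a 100M freestyle time (sec.)
# to a time-standard class ("AAAA" vs. "B"); thresholds are made up.
clf_a = lambda t: "AAAA" if t <= 54.09 else "B"
clf_b = lambda t: "AAAA" if t <= 54.5 else "B"
clf_c = lambda t: "AAAA" if t <= 53.0 else "B"

print(wocc_predict([clf_a, clf_b, clf_c], 54.0))  # two of three vote "AAAA"
```

In the study, the crowd's diversity would come from replacing the toy lambdas with the twelve trained models (KNN, SVMs, random forest, and so on).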

Citation: Jiang Xie, Junfu Xu, Celine Nie, Qing Nie. Machine learning of swimming data via wisdom of crowd and regression analysis. Mathematical Biosciences & Engineering, 2017, 14 (2) : 511-527. doi: 10.3934/mbe.2017031

##### Figures:

• The average variance and CV as a function of age for different strokes in the LCM
• The average variance and CV as a function of age for different strokes in the SCY
• The average variance and CV in time for different distances (LCM)
• The average variance and CV in time for different distances (SCY)
• Pearson correlation coefficient between younger ages and age 18
• 100M freestyle performance regression analysis
• 100Y freestyle performance regression analysis
• Classification model
• Predictions of swimming times
• Illustration of a Wisdom of Crowd Classifier (WoCC)
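The correlation figure uses the Pearson product-moment coefficient. A self-contained computation, on made-up times for the same swimmers at ages 12 and 18, might look like:

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (made-up) 100M freestyle times: the same five swimmers
# measured at age 12 and again at age 18.
times_age12 = [62.1, 65.3, 60.8, 70.2, 66.5]
times_age18 = [53.4, 55.9, 52.7, 58.8, 56.1]
print(round(pearson(times_age12, times_age18), 3))  # strongly positive
```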
A sample of the USA Swimming data set
| Stroke | Course | Age | Time (sec.) | Power points |
| --- | --- | --- | --- | --- |
| 100Y_FR | SCY | 21 | 41.12 | 1053 |
| 100M_FL | LCM | 24 | 53.83 | 926 |
| 100M_FR | LCM | 25 | 50.01 | 930 |
| 200Y_FR | SCY | 20 | 96.52 | 897 |
| 400M_IM | LCM | 18 | 273.69 | 834 |
| 800M_FR | LCM | 16 | 520.64 | 750 |
| … | … | … | … | … |
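The variance and CV figures summarize the spread of times in records like these. As a stdlib sketch (the grouping by course is our illustrative choice; the paper groups by age, stroke, and distance), the coefficient of variation over the sample rows can be computed as:

```python
import statistics

# The sample records above, as (stroke, course, age, time_sec, power_points).
records = [
    ("100Y_FR", "SCY", 21, 41.12, 1053),
    ("100M_FL", "LCM", 24, 53.83, 926),
    ("100M_FR", "LCM", 25, 50.01, 930),
    ("200Y_FR", "SCY", 20, 96.52, 897),
    ("400M_IM", "LCM", 18, 273.69, 834),
    ("800M_FR", "LCM", 16, 520.64, 750),
]

def cv(times):
    """Coefficient of variation: sample standard deviation over the mean."""
    return statistics.stdev(times) / statistics.mean(times)

# Group times by course (LCM vs. SCY) and report each group's CV.
for course in ("LCM", "SCY"):
    times = [t for _, c, _, t, _ in records if c == course]
    print(course, round(cv(times), 3))
```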
The coefficients of the prediction equation (LCM and SCY)
| Group | LCM | SCY |
| --- | --- | --- |
| Female Group 1 | 0.28, -9.31, 137.4 | 0.31, -10.55, 140.4 |
| Female Group 2 | 0.31, -10.44, 146.4 | 0.34, -11.22, 145.2 |
| Female Group 3 | 0.33, -10.86, 148.8 | 0.35, -11.48, 146.6 |
| Female Group 4 | 0.33, -10.85, 148.2 | 0.35, -11.37, 145.2 |
| Male Group 1 | 0.33, -12.18, 165.4 | 0.20, -7.99, 126.5 |
| Male Group 2 | 0.32, -11.87, 163.1 | 0.24, -9.36, 136.6 |
| Male Group 3 | 0.32, -11.69, 161.3 | 0.27, -10.24, 143.2 |
| Male Group 4 | 0.34, 12.35, 166.1 | 0.29, -10.78, 147.3 |
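The three numbers per course appear to be coefficients of a quadratic fit of time against age. Assuming the form time(age) = a·age² + b·age + c (our reading of the table, not stated on this page), a prediction can be evaluated as:

```python
def predicted_time(a, b, c, age):
    """Evaluate the assumed quadratic age-performance model
    time = a*age^2 + b*age + c (time in seconds)."""
    return a * age**2 + b * age + c

# Female Group 1, LCM coefficients from the table above (assumed order a, b, c).
a, b, c = 0.28, -9.31, 137.4
for age in (12, 15, 18):
    print(age, round(predicted_time(a, b, c, age), 2))
```

With these coefficients the curve bottoms out in the mid-teens (vertex at age ≈ -b/2a ≈ 16.6), consistent with the paper's interest in critical training ages.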
Definition of male swimming time standards in 100M freestyle levels
| Time Standards/Cuts | Mean time in 18-year-old males (sec.) |
| --- | --- |
| AAAA Min | time ≤ 54.09 |
| AAA Min | 54.09 < time ≤ … |
| … | … |
| … | time > 56.59 |
Parameters of each method
| Methods | Parameters |
| --- | --- |
| KNN | $k$ = 5 |
| Linear SVM | $kernel$ = linear |
| RBF SVM | $kernel$ = rbf |
| DT | $max\_depth$ = 10 |
| RF | $max\_depth$ = 10, $n\_estimators$ = 10, $max\_features$ = 1 |
| AdaBoost | default parameters |
| NB | default parameters |
| LDA | default parameters |
| QDA | default parameters |
| ANN | 2 layers with 24 inputs, 10 neurons in hidden layer |
| SVR | $kernel$ = linear, $C$ = 1.0 |
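As an illustration of the first row (KNN with k = 5), here is a minimal from-scratch nearest-neighbour classifier on a single hypothetical feature (a swim time); the study itself uses library implementations, and the training pairs below are made up.

```python
from collections import Counter

def knn_classify(train, x, k=5):
    """k-nearest-neighbour vote: the k training points closest to x
    (1-D Euclidean distance) vote on the class label."""
    neighbours = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    labels = [label for _, label in neighbours]
    return Counter(labels).most_common(1)[0][0]

# Hypothetical training set: (100M freestyle time in sec., time-standard class).
train = [(52.8, "AAAA"), (53.5, "AAAA"), (54.0, "AAAA"),
         (55.2, "AAA"), (55.9, "AAA"), (56.8, "B"), (58.1, "B")]
print(knn_classify(train, 53.9))  # the five nearest neighbours are mostly "AAAA"
```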
MAD of swimming time predictions for breaststroke
| Methods | male 100M (103 records) | male 100Y (179 records) | female 100M (143 records) | female 100Y (210 records) |
| --- | --- | --- | --- | --- |
| QPR | 8.00 s | 7.70 s | 8.98 s | 8.42 s |
| ANN | 1.20 s | 2.44 s | 2.97 s | 2.65 s |
| SVR | 1.01 s | 1.90 s | 2.43 s | 1.67 s |
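MAD here is the mean absolute deviation between predicted and recorded times; with made-up example times the computation reads:

```python
def mad(predicted, actual):
    """Mean absolute deviation (seconds) between predicted and actual times."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

# Illustrative (made-up) 100M breaststroke times in seconds.
actual    = [68.2, 71.5, 66.9, 74.0]
predicted = [67.2, 72.5, 67.9, 73.0]
print(round(mad(predicted, actual), 2))  # -> 1.0
```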
MAD of swimming time predictions for freestyle
| Methods | male 100M (340 records) | male 100Y (572 records) | female 100M (548 records) | female 100Y (743 records) |
| --- | --- | --- | --- | --- |
| QPR | 6.70 s | 8.00 s | 8.21 s | 6.05 s |
| ANN | 1.50 s | 1.24 s | 1.50 s | 1.17 s |
| SVR | 1.18 s | 1.02 s | 1.28 s | 0.95 s |
Prediction accuracy for freestyle
| Methods | male 100M (340 records) | male 100Y (572 records) | female 100M (548 records) | female 100Y (743 records) |
| --- | --- | --- | --- | --- |
| KNN | 0.55 | 0.60 | 0.59 | 0.58 |
| Linear SVM | 0.60 | 0.62 | 0.63 | 0.64 |
| RBF SVM | 0.58 | 0.61 | 0.66 | 0.63 |
| DT | 0.48 | 0.54 | 0.62 | 0.58 |
| RF | 0.59 | 0.60 | 0.67 | 0.64 |
| AdaBoost | 0.53 | 0.59 | 0.59 | 0.60 |
| NB | 0.53 | 0.49 | 0.56 | 0.55 |
| LDA | 0.60 | 0.60 | 0.64 | 0.63 |
| QDA | 0.52 | 0.56 | 0.58 | 0.53 |
| WoCC | 0.61 | 0.64 | 0.67 | 0.65 |
Prediction accuracy for breaststroke
| Methods | male 100M (103 records) | male 100Y (179 records) | female 100M (143 records) | female 100Y (210 records) |
| --- | --- | --- | --- | --- |
| KNN | 0.50 | 0.46 | 0.75 | 0.64 |
| Linear SVM | 0.47 | 0.46 | 0.70 | 0.66 |
| RBF SVM | 0.37 | 0.36 | 0.66 | 0.54 |
| DT | 0.44 | 0.50 | 0.69 | 0.64 |
| RF | 0.46 | 0.49 | 0.75 | 0.63 |
| AdaBoost | 0.47 | 0.45 | 0.57 | 0.56 |
| NB | 0.53 | 0.53 | 0.66 | 0.65 |
| LDA | 0.45 | 0.45 | 0.67 | 0.64 |
| QDA | 0.41 | 0.41 | 0.63 | 0.56 |
| WoCC | 0.61 | 0.49 | 0.75 | 0.66 |

