# American Institute of Mathematical Sciences

eISSN: 2577-8838


## Mathematical Foundations of Computing

February 2020, Volume 3, Issue 1



2020, 3(1): 1-9. doi: 10.3934/mfc.2020001
Abstract:

In numerous real-world applications, the class imbalance problem is prevalent. When training samples of one class vastly outnumber those of the other classes, traditional machine learning algorithms become biased towards the majority class (the class with more samples), leading to significant losses in model performance. Several techniques have been proposed to handle class imbalance, including data sampling and boosting. In this paper, we present a cluster-based oversampling with boosting algorithm (Cluster+Boost) for learning from imbalanced data. We compare the performance of the proposed approach with state-of-the-art ensemble learning methods such as AdaBoost, RUSBoost and SMOTEBoost. We conducted experiments on 22 data sets with various imbalance ratios. The experimental results are promising and provide an alternative approach for improving classifier performance on highly imbalanced data sets.
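The abstract does not spell out the sampling step of Cluster+Boost; as a rough illustration of the general idea behind cluster-based oversampling, here is a minimal Python sketch. The function name and the proportional-allocation rule are assumptions for illustration, not the authors' method:

```python
import random

def cluster_oversample(majority, minority_clusters, seed=0):
    """Oversample a clustered minority class until it matches the
    majority class size. Each cluster receives duplicated samples in
    proportion to its current size, so the minority class's internal
    structure is preserved instead of inflating a single region.
    (Illustrative sketch only; not the paper's exact algorithm.)"""
    rng = random.Random(seed)
    n_minority = sum(len(c) for c in minority_clusters)
    n_needed = len(majority) - n_minority
    balanced = [list(c) for c in minority_clusters]
    for i, cluster in enumerate(minority_clusters):
        # This cluster's proportional share of the new samples.
        share = round(n_needed * len(cluster) / n_minority)
        balanced[i].extend(rng.choices(cluster, k=share))
    return balanced

majority = [("maj", i) for i in range(20)]
clusters = [[("a", i) for i in range(4)], [("b", i) for i in range(4)]]
balanced = cluster_oversample(majority, clusters)
print(sum(len(c) for c in balanced))  # 20: minority now matches majority
```

In a full pipeline, the clusters would typically come from running k-means on the minority class, and the balanced data would then feed a boosting loop.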

2020, 3(1): 11-24. doi: 10.3934/mfc.2020002
Abstract:

In this article, the authors consider the orbital stability of periodic traveling wave solutions for the coupled compound KdV and MKdV equations with two components

Firstly, we show that there exists a smooth curve of positive traveling wave solutions of dnoidal type with a fixed fundamental period $L$ for the coupled compound KdV and MKdV equations. Then, combining the orbital stability theory presented by Grillakis et al. with a detailed spectral analysis based on the Lamé equation and Floquet theory, we show that the dnoidal-type periodic wave solution with period $L$ is orbitally stable. As the modulus of the Jacobian elliptic function $k\rightarrow 1$, we obtain from our work the orbital stability of the solitary wave solution with zero asymptotic value for the coupled compound KdV and MKdV equations. In addition, we also obtain stability results for the coupled compound KdV and MKdV equations under the degenerate condition $v = 0$, called the compound KdV and MKdV equation.
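The passage from periodic to solitary waves rests on a standard degeneration of the Jacobian elliptic function dn, which can be stated explicitly:

```latex
% dn(x; k) has fundamental period 2K(k), where K(k) is the complete
% elliptic integral of the first kind. As k -> 1, K(k) -> infinity,
% so the period L of the dnoidal wave diverges and the profile
% degenerates into a sech-type solitary wave:
\mathrm{dn}(x; k) \;\longrightarrow\; \operatorname{sech}(x),
\qquad k \to 1.
```

This is why orbital stability of the solitary wave with zero asymptotic value follows as a limiting case of the periodic (dnoidal) result.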

2020, 3(1): 25-40. doi: 10.3934/mfc.2020003
Abstract:

This paper studies the regularized learning algorithm for regression associated with correntropy-induced losses in reproducing kernel Hilbert spaces. The main goal is an error analysis for the regression problem in learning theory based on the maximum correntropy criterion. Explicit learning rates are provided. Our analysis shows that, when a suitable parameter of the loss function is chosen, satisfactory learning rates are obtained. The rates depend on the regularization error and the covering numbers of the reproducing kernel Hilbert space.
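For readers unfamiliar with correntropy, one common form of the correntropy-induced loss with a Gaussian kernel and scale parameter $\sigma$ (the paper's exact parametrization may differ) is:

```latex
% Correntropy-induced loss with Gaussian kernel of scale \sigma;
% maximizing correntropy is equivalent to minimizing this loss.
\ell_\sigma\bigl(y, f(x)\bigr)
  \;=\; \sigma^2\left(1 - \exp\!\left(-\frac{(y - f(x))^2}{2\sigma^2}\right)\right).
% For small residuals, \ell_\sigma \approx (y - f(x))^2 / 2, i.e. least
% squares; for large residuals it saturates at \sigma^2, which is what
% makes the estimator robust to outliers.
```

The parameter $\sigma$ thus trades off approximation quality against robustness, which is why the learning rates depend on choosing it suitably.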

2020, 3(1): 41-50. doi: 10.3934/mfc.2020004
Abstract:

In this paper, an asymptotic formula for the so-called multivariate neural network (NN) operators is established. As a direct consequence, first- and second-order pointwise Voronovskaja-type theorems are obtained. Finally, the particular case of NN operators activated by the logistic function is treated in detail.

2020, 3(1): 51-64. doi: 10.3934/mfc.2020005
Abstract:

To further enhance the performance of current convolutional neural networks, an improved deep convolutional neural network model is presented in this paper. Unlike the traditional network structure, in the proposed method each pooling layer is replaced by two consecutive convolutional layers with $3 \times 3$ convolution kernels, between which a dropout layer is added to reduce overfitting, and cross-entropy is used as the loss function. Experimental results on the MNIST and CIFAR-10 image classification data sets show that, compared to several classical neural networks such as AlexNet, VGGNet and GoogLeNet, the improved network achieves better learning efficiency and recognition accuracy at relatively shallow network depths.
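The abstract does not give the stride or padding of the replacement layers; the following Python sketch only illustrates the shape arithmetic of the substitution under the assumption of stride 1 and no padding (the authors' settings may differ):

```python
def conv_out(size, kernel=3, stride=1, padding=0):
    """Spatial output size of a convolution layer."""
    return (size + 2 * padding - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    """Spatial output size of a standard 2x2 max-pooling layer."""
    return (size - kernel) // stride + 1

# Traditional block: 2x2 pooling halves the feature map.
# Proposed block (per the abstract): the pooling layer is replaced by
# two consecutive 3x3 convolutions with a dropout layer in between
# (dropout does not change spatial shape). With stride 1 and no
# padding, each 3x3 conv trims one pixel per side, and the spatial
# reduction is learned rather than fixed.
size = 32                                    # e.g. a CIFAR-10 input
after_pool = pool_out(size)                  # 16
after_two_convs = conv_out(conv_out(size))   # 30 -> 28
print(after_pool, after_two_convs)           # 16 28
```

The trade-off: the convolutional replacement keeps more spatial resolution and adds trainable parameters, which is consistent with the reported gains at relatively shallow depths.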