
eISSN: 2577-8838
Mathematical Foundations of Computing
February 2020, Volume 3, Issue 1
In numerous real-world applications, the class imbalance problem is prevalent. When training samples of one class immensely outnumber samples of the other classes, traditional machine learning algorithms show bias towards the majority class (the class with more samples), leading to a significant loss of model performance. Several techniques have been proposed to handle the class imbalance problem, including data sampling and boosting. In this paper, we present a cluster-based oversampling with boosting algorithm (Cluster+Boost) for learning from imbalanced data. We evaluate the performance of the proposed approach against state-of-the-art ensemble learning methods such as AdaBoost, RUSBoost and SMOTEBoost. We conducted experiments on 22 data sets with various imbalance ratios. The experimental results are promising and provide an alternative approach for improving classifier performance when learning from highly imbalanced data sets.
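As a rough illustration of the cluster-then-oversample idea described above (a sketch, not the authors' Cluster+Boost implementation), one could oversample the minority class within k-means clusters before handing the rebalanced data to a boosted ensemble; the helper below and its parameters are hypothetical:

```python
# Hypothetical sketch of cluster-based oversampling followed by boosting.
# Function names, cluster counts and sampling targets are illustrative choices,
# not the authors' method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import AdaBoostClassifier
from sklearn.utils import resample


def cluster_oversample(X, y, minority_label, n_clusters=5, random_state=0):
    """Oversample the minority class cluster by cluster until it roughly matches the majority size."""
    X_min, X_maj = X[y == minority_label], X[y != minority_label]
    clusters = KMeans(n_clusters=n_clusters, random_state=random_state).fit_predict(X_min)
    target = len(X_maj) // n_clusters  # samples drawn per cluster (illustrative choice)
    parts = [resample(X_min[clusters == c], replace=True, n_samples=target,
                      random_state=random_state) for c in range(n_clusters)]
    X_new = np.vstack([X_maj] + parts)
    y_new = np.concatenate([y[y != minority_label],
                            np.full(target * n_clusters, minority_label)])
    return X_new, y_new


# Usage: rebalance first, then train a boosted ensemble on the rebalanced data.
# X_bal, y_bal = cluster_oversample(X_train, y_train, minority_label=1)
# clf = AdaBoostClassifier(n_estimators=100).fit(X_bal, y_bal)
```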
In this article, the authors consider the orbital stability of periodic traveling wave solutions for the coupled compound KdV and MKdV equations with two components.
First, we show that there exists a smooth curve of positive traveling wave solutions of dnoidal type with a fixed fundamental period.
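The displayed equations of this abstract were lost in extraction; purely for orientation, a scalar compound KdV-MKdV equation and a dnoidal traveling-wave ansatz (not the coupled two-component system studied in the paper) take the form:

```latex
% Illustrative only: a scalar compound KdV--MKdV equation and a dnoidal ansatz
% built from the Jacobi elliptic function dn; the paper's coupled two-component
% system is not reproduced here.
u_t + a\,u u_x + b\,u^2 u_x + u_{xxx} = 0, \qquad
u(x,t) = \varphi(x - ct), \qquad
\varphi(\xi) = \eta_1\,\mathrm{dn}\!\left(\eta_2\,\xi ;\, k\right).
```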
This paper studies the regularized learning algorithm for regression associated with correntropy-induced losses in reproducing kernel Hilbert spaces. The main target is the error analysis for the regression problem in learning theory based on the maximum correntropy criterion. Explicit learning rates are provided. Our analysis shows that, when a suitable parameter of the loss function is chosen, satisfactory learning rates are obtained. The rates depend on the regularization error and the covering numbers of the reproducing kernel Hilbert space.
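One common way to make the correntropy idea concrete (purely as a sketch, not the paper's algorithm or its rates) is to minimize a Welsch-type loss l_sigma(r) = sigma^2 * (1 - exp(-r^2 / sigma^2)) plus an RKHS penalty by iteratively reweighted kernel ridge regression:

```python
# Hypothetical sketch: kernel regression with a correntropy-induced (Welsch) loss,
# solved by iteratively reweighted least squares. sigma, lam and gamma are
# illustrative parameters; this is not the paper's estimator or its error analysis.
import numpy as np


def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def correntropy_krr(X, y, sigma=1.0, lam=1e-2, gamma=1.0, n_iter=20):
    """Fit f(x) = sum_i alpha_i k(x_i, x) minimizing sum_i l_sigma(y_i - f(x_i)) + lam*n*||f||_K^2."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        r = y - K @ alpha                     # current residuals
        w = np.exp(-(r ** 2) / sigma ** 2)    # correntropy weights: large residuals are downweighted
        W = np.diag(w)
        # weighted kernel ridge update (half-quadratic step)
        alpha = np.linalg.solve(W @ K + lam * n * np.eye(n), W @ y)
    return alpha


# Usage: alpha = correntropy_krr(X_train, y_train)
#        y_pred = rbf_kernel(X_test, X_train) @ alpha
```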
In this paper, an asymptotic formula for the so-called multivariate neural network (NN) operators is established. As a direct consequence, first- and second-order pointwise Voronovskaja-type theorems are obtained. Finally, the particular case of NN operators activated by the logistic function is treated in detail.
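For orientation, a frequently used construction of logistic-activated NN operators is sketched below; the paper's multivariate definition may differ in its normalization and index set, so this should be read as an assumption rather than the authors' exact formula:

```latex
% Sketch of a common construction of NN operators activated by the logistic
% (sigmoidal) function, written here in the univariate case for simplicity.
\sigma_{\ell}(x) = \frac{1}{1 + e^{-x}}, \qquad
\phi_{\sigma_{\ell}}(x) = \tfrac12\bigl[\sigma_{\ell}(x+1) - \sigma_{\ell}(x-1)\bigr],
\qquad
F_n(f,x) = \frac{\displaystyle\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}
                 f\!\left(\tfrac{k}{n}\right)\phi_{\sigma_{\ell}}(nx-k)}
                {\displaystyle\sum_{k=\lceil na\rceil}^{\lfloor nb\rfloor}
                 \phi_{\sigma_{\ell}}(nx-k)},
\qquad x \in [a,b].
```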
To further enhance the performance of current convolutional neural networks, an improved deep convolutional neural network model is proposed in this paper. Unlike the traditional network structure, in the proposed method the pooling layer is replaced by two continuous convolutional layers with
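The general idea of trading a pooling layer for learned downsampling can be sketched as follows; since the abstract is cut off, the kernel sizes, strides and channel counts below are assumptions, not the authors' architecture:

```python
# Hypothetical sketch of replacing a pooling layer with two stacked convolutional
# layers. Kernel sizes, strides and channel counts are illustrative assumptions.
import torch
import torch.nn as nn


class ConvDownsample(nn.Module):
    """Two consecutive convolutions used in place of a 2x2 max-pooling layer."""

    def __init__(self, channels):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),             # keeps spatial size
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1),   # learned downsampling
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


# Usage: swap nn.MaxPool2d(2) for ConvDownsample(channels) inside an existing CNN.
# x = torch.randn(1, 64, 32, 32); ConvDownsample(64)(x).shape  ->  torch.Size([1, 64, 16, 16])
```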