
eISSN: 2577-8838
Mathematical Foundations of Computing
May 2021, Volume 4, Issue 2
The classification of Hyperspectral Images (HSI) plays an important role in various fields. To achieve more precise multi-target classification in a short time, this paper presents a method combining discrete non-local theory with traditional variable-fraction Potts models. The nonlocal operator makes better use of the information in a region centered on each pixel, while a constraint added to the model ensures that every pixel in the HSI is assigned exactly one class. The proposed model is non-convex, nonlinear, and non-smooth, so obtaining a globally optimal result is difficult. By introducing a series of auxiliary variables and applying the alternating direction method of multipliers (ADMM), the proposed classification model is transformed into a series of convex subproblems. Finally, we conducted comparison experiments with support vector machines (SVM), K-nearest neighbors (KNN), and a convolutional neural network (CNN) on five HSI data sets of different dimensions. The numerical results illustrate that the proposed method is stable and efficient, and that our algorithm produces more accurate predictions in a shorter time, especially when classifying data sets with more spectral layers.
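The abstract above describes splitting a non-convex model into convex subproblems via ADMM. As a minimal, self-contained sketch of that splitting idea only (a toy denoising problem, not the paper's actual Potts model; the function names, `rho`, and iteration count are assumptions for illustration), here is scaled ADMM for min_x ½‖x − b‖² + λ‖x‖₁, whose solution is also known in closed form, so the two can be compared:

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_toy(b, lam, rho=1.0, iters=200):
    """Scaled ADMM for min_x 0.5||x - b||^2 + lam*||x||_1,
    split as f(x) = 0.5||x - b||^2, g(z) = lam*||z||_1, constraint x = z."""
    x = np.zeros_like(b)
    z = np.zeros_like(b)
    u = np.zeros_like(b)  # scaled dual variable
    for _ in range(iters):
        x = (b + rho * (z - u)) / (1.0 + rho)  # x-update: quadratic subproblem
        z = soft_threshold(x + u, lam / rho)   # z-update: l1 proximal step
        u = u + x - z                          # dual (multiplier) update
    return z

b = np.array([3.0, -0.5, 0.2, -2.0])
lam = 1.0
x_admm = admm_toy(b, lam)
x_closed = soft_threshold(b, lam)  # closed-form solution of this toy problem
print(np.allclose(x_admm, x_closed, atol=1e-6))
```

Each subproblem here is convex with a cheap exact solution, which mirrors why the splitting in the paper is attractive: the hard joint problem is replaced by easy alternating updates.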
We introduce the concept of interval harmonical
In this paper, we investigate explicit exact traveling wave solutions of the generalized (3+1)-dimensional KP equation, which describes the dynamics of solitons and nonlinear waves in plasma physics and fluid dynamics.
The 3D micropolar system with a damping term is considered. In this paper, the existence of global attractors for the 3D micropolar equations with a damping term is proved via uniform estimates.
In this paper, we evaluate two deep learning models that integrate convolutional and recurrent neural networks. We implement both sequential and parallel architectures for fine-grained musical subgenre classification. Due to the exceptionally low signal-to-noise ratio (SNR) of our low-level mel-spectrogram dataset, sensitive yet robust learning models are required to generate meaningful results. We investigate the effects of three commonly applied optimizers, dropout, batch regularization, and sensitivity to varying initialization distributions. The results demonstrate that the sequential model specifically requires the RMSprop optimizer, while the parallel model implemented with the Adam optimizer yielded encouraging and stable results in terms of average F1 score.
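The abstract above contrasts sequential and parallel CNN/RNN architectures over mel-spectrogram input. As a minimal numpy sketch of the parallel (two-branch) idea only (the layer sizes, kernel, and weights are toy assumptions, not the authors' models), the convolutional branch and the recurrent branch each consume the same spectrogram and their features are concatenated:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_branch(spec, kernel):
    """Tiny 'CNN' branch: 1-D convolution along time, ReLU, global average pool."""
    out = np.array([np.convolve(row, kernel, mode="valid") for row in spec])
    return np.maximum(out, 0.0).mean(axis=1)  # one feature per mel band

def rnn_branch(spec, W, U):
    """Tiny 'RNN' branch: vanilla tanh recurrence over time frames."""
    h = np.zeros(W.shape[0])
    for t in range(spec.shape[1]):
        h = np.tanh(W @ h + U @ spec[:, t])
    return h  # final hidden state as the branch's feature vector

n_mels, n_frames, hidden = 8, 32, 4
spec = rng.standard_normal((n_mels, n_frames))  # stand-in mel-spectrogram
W = rng.standard_normal((hidden, hidden)) * 0.1
U = rng.standard_normal((hidden, n_mels)) * 0.1
kernel = np.array([1.0, -1.0, 1.0])

# Parallel architecture: both branches see the input; features are concatenated
features = np.concatenate([conv_branch(spec, kernel), rnn_branch(spec, W, U)])
print(features.shape)  # (n_mels + hidden,) = (12,)
```

In a sequential architecture, by contrast, the RNN would consume the CNN's output rather than the raw spectrogram, so the two designs differ in what temporal structure the recurrence sees.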
2021 CiteScore: 0.2