Optimal parameter selection in support vector machines
The purpose of this paper is to apply a nonlinear programming algorithm to compute the kernel and related parameters of
a support vector machine (SVM) by a two-level approach.
The available training data are split into two groups: one set is used to formulate a quadratic SVM with $L_2$-soft margin,
and the other to minimize the generalization error, into which the optimal SVM variables are inserted.
Subsequently, the total generalization error is evaluated for a separate set of test data.
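To make the two-level structure concrete, the following minimal sketch solves the inner $L_2$-soft-margin SVM for fixed kernel parameters and minimizes the resulting validation error over those parameters. It is an illustration only: the anisotropic Gaussian kernel, the omission of the bias term and of the nonnegativity constraints on the dual variables, and the derivative-free outer solver are simplifying assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

def gaussian_kernel(X1, X2, p):
    """Anisotropic Gaussian kernel with one weight p[i] per feature (assumption)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2 * p).sum(axis=-1)
    return np.exp(-d2)

def inner_svm(X, y, p, C):
    """Inner level: for fixed kernel parameters, the L2-soft-margin dual
    (bias and nonnegativity constraints dropped for brevity) reduces to
    the linear system (Q + I/C) alpha = 1, solved via Cholesky."""
    Q = gaussian_kernel(X, X, p) * np.outer(y, y)
    cf = cho_factor(Q + np.eye(len(y)) / C)
    alpha = cho_solve(cf, np.ones(len(y)))
    return alpha, cf

def outer_objective(log_p, X_tr, y_tr, X_val, y_val, C):
    """Outer level: insert the optimal dual variables into the
    validation error and minimize over the kernel parameters."""
    p = np.exp(log_p)  # keep the per-feature weights positive
    alpha, _ = inner_svm(X_tr, y_tr, p, C)
    f = gaussian_kernel(X_val, X_tr, p) @ (alpha * y_tr)
    return np.mean(f * y_val <= 0)  # misclassification rate on the second set

# Example call (Nelder-Mead replaces the paper's derivative-based solver):
# res = minimize(outer_objective, x0=np.zeros(X_tr.shape[1]),
#                args=(X_tr, y_tr, X_val, y_val, 10.0),
#                method="Nelder-Mead")
```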
Derivatives of the functions by which the optimization problem is defined are evaluated analytically, exploiting
the Cholesky decomposition that is already available from solving the quadratic SVM.
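Under the same simplified setting as the sketch above, the analytic derivative follows from implicit differentiation of the inner linear system, and the factorization from the inner solve can be reused rather than recomputed; a hedged sketch:

```python
from scipy.linalg import cho_solve

def d_alpha_dp(cf, dQ_dp, alpha):
    # Differentiating (Q + I/C) alpha = 1 with respect to a kernel
    # parameter p gives (Q + I/C) d_alpha = -(dQ/dp) alpha, so the
    # existing Cholesky factorization cf is reused for a cheap
    # backsolve instead of refactorizing the matrix.
    return cho_solve(cf, -dQ_dp @ alpha)
```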
The approach is implemented and tested on standard benchmark data sets with up to 4,800 patterns.
The results show a significant reduction of the generalization error, an increase of the margin, and a reduction of the number of
support vectors in all cases where the data sets are sufficiently large.
In a second set of test runs, kernel parameters are assigned to individual features. Redundant attributes are identified,
and suitable relative weighting factors are computed.
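A standard way to realize such per-feature kernel parameters, consistent with the sketch above though not necessarily the paper's exact kernel family, is an anisotropic Gaussian kernel

$$k(x, z) = \exp\Big(-\sum_{i=1}^{n} p_i\,(x_i - z_i)^2\Big), \qquad p_i \ge 0,$$

where a weight $p_i$ driven toward zero marks feature $i$ as redundant, and the relative magnitudes of the remaining weights serve as the weighting factors.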