Abstract:
Support Vector Machines (SVMs) are widely used, powerful learning algorithms based on statistical learning theory. In SVM learning, the data are mapped to a high-dimensional space via a non-linear kernel function, and a maximal margin hyperplane separating the positive and negative instances is found in this new space. Choosing the best kernel and generalizing to multi-class cases are fundamental problems in SVM learning.

In the literature, the best kernel function is chosen by trial and error. We propose a cross-validation based model selection method to find the optimal kernel. Candidate classifiers with different kernels are trained, and hybrid models are built by selecting the best model using the 5x2 cross-validation F test. The performance of the proposed hybrid model is compared to that of classical SVMs in one-vs-all (OVA) and pairwise classification schemes in terms of accuracy and complexity (as measured by the number of support vectors stored).

The basic two-class support vector machine must be extended to handle multi-class problems; for this purpose, we incorporate our proposed hybrid models in one-vs-all, pairwise, and error-correcting output code (ECOC) schemes. We find that the proposed hybrid model together with ECOC finds accurate solutions to multi-class problems without significantly increasing complexity.
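The kernel-selection step above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes scikit-learn and scipy are available, uses a synthetic dataset in place of the paper's benchmarks, and compares a linear and an RBF kernel with the 5x2 cross-validation F test on error-rate differences.

```python
import numpy as np
from scipy.stats import f as f_dist
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import SVC

def cv_5x2_f_test(clf_a, clf_b, X, y, seed=0):
    """5x2 cross-validation F test on the error-rate differences
    of two classifiers (5 replications of 2-fold cross-validation)."""
    diffs = np.zeros((5, 2))  # diffs[i, j]: error difference in replication i, fold j
    for i in range(5):
        folds = StratifiedKFold(n_splits=2, shuffle=True, random_state=seed + i)
        for j, (tr, te) in enumerate(folds.split(X, y)):
            err_a = 1.0 - clf_a.fit(X[tr], y[tr]).score(X[te], y[te])
            err_b = 1.0 - clf_b.fit(X[tr], y[tr]).score(X[te], y[te])
            diffs[i, j] = err_a - err_b
    mean = diffs.mean(axis=1, keepdims=True)
    s2 = ((diffs - mean) ** 2).sum(axis=1)       # variance per replication
    f_stat = (diffs ** 2).sum() / (2 * s2.sum())  # ~ F(10, 5) under H0
    p_value = 1.0 - f_dist.cdf(f_stat, 10, 5)
    return f_stat, p_value

# Hypothetical toy problem: pick between two candidate kernels.
X, y = make_classification(n_samples=300, n_features=10, random_state=42)
f_stat, p = cv_5x2_f_test(SVC(kernel="linear"), SVC(kernel="rbf"), X, y)
```

If the test fails to reject the null hypothesis (large p), the two kernels are statistically indistinguishable on this data, and the simpler candidate can be kept; otherwise the one with lower cross-validated error is selected.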