
Sliding Mode Control based Support Vector Machine RBF Kernel Parameter Optimization


Abstract:

Support Vector Machine (SVM) is a learning-based algorithm which is widely used for classification in many applications. Despite its advantages, its application to large-scale datasets is limited by its use of a large number of support vectors and the dependency of its performance on its kernel parameter. This paper presents a Sliding Mode Control based Support Vector Machine Radial Basis Function kernel parameter optimization (SMC-SVM-RBF) method, inspired by sliding mode closed-loop control theory, which has demonstrated significantly higher performance than standard closed-loop control techniques. The proposed method first defines an error equation and a sliding surface and then iteratively updates the RBF kernel parameter based on sliding mode control theory, forcing the SVM training error to converge below a predefined threshold. The closed-loop nature of the proposed algorithm increases its robustness to uncertainty and improves its convergence speed. Experimental results were generated using nine standard benchmark datasets covering a wide range of applications. The results show that the proposed SMC-SVM-RBF method is significantly faster than classical SVM-based techniques. Moreover, it produces more accurate results than most state-of-the-art SVM-based methods.
Date of Conference: 09-10 December 2019
Date Added to IEEE Xplore: 27 February 2020
Print on Demand(PoD) ISSN: 1558-2809
Conference Location: Abu Dhabi, United Arab Emirates
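
The method outlined in the abstract, defining a training-error sliding surface and iteratively adjusting the RBF kernel parameter until the error converges below a threshold, can be sketched roughly as follows. This is only an illustrative approximation and not the authors' algorithm: the paper's exact error equation, sliding surface, and update law are not reproduced on this page, so the sign-based switching update and the helper smc_rbf_gamma_search below are assumptions.

```python
# Illustrative sketch (assumed, not the paper's exact SMC-SVM-RBF algorithm):
# treat the training error as the sliding variable, define a simple sliding
# surface s = error - tol, and apply a sign-based switching update to the RBF
# kernel parameter gamma until the error is driven below the threshold.
import numpy as np
from sklearn.svm import SVC

def smc_rbf_gamma_search(X, y, gamma0=1.0, eta=0.1, tol=0.02, max_iter=30):
    gamma = gamma0
    prev_error = None
    for _ in range(max_iter):
        clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)
        error = 1.0 - clf.score(X, y)      # training error (sliding variable)
        s = error - tol                    # assumed sliding surface
        if s <= 0:                         # error below threshold: converged
            break
        # Switching-style update: flip the search direction if the last
        # gamma change made the training error worse.
        if prev_error is not None and error > prev_error:
            eta = -eta
        gamma = max(gamma * (1.0 + eta * np.sign(s)), 1e-6)
        prev_error = error
    return clf, gamma
```

In this sketch, eta plays the role of the switching gain and tol the predefined error threshold mentioned in the abstract; both values are placeholders.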

I. Introduction

Support Vector Machine (SVM) is a machine learning algorithm that is widely used for classification. SVM is one of the most robust and efficient classification methods among well-known algorithms such as nearest neighbor, boosted decision trees, regularized logistic regression, neural networks, and random forests [1]–[3]. When dealing with non-linearly separable data, SVM maps the data into a higher-dimensional space using kernels prior to performing the classification [4]. SVM formulates a quadratic programming (QP) problem to find a separating hyperplane that maximizes the margin between the two classes of the data [3], [5], [6]. Since SVM achieves a unique solution and its generalization does not depend directly on the dimensionality of the feature space, it is more robust to overfitting than other techniques [4], [6], [7]. Despite all the advantages and applications of the SVM [8], [9], its classification speed deteriorates on large-scale problems because it uses a large number of support vectors (SVs). In addition, its training is computationally expensive and time-consuming [10], [11]. Over the last two decades, many techniques have been proposed to speed up SVM testing and training [5], [10]–[18], most of which reduce the number of SVs. However, there is still demand for more powerful techniques. SVM has also been used in some branches of control, such as nonlinear [19], [20] and optimal control [21], owing to its capabilities. However, the application of Sliding Mode Control (SMC) to speed up SVM training and improve its performance has not been reported in the literature.
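
For context on the formulation above, the snippet below is a minimal scikit-learn example (not taken from the paper) of a standard RBF-kernel SVM: the kernel implicitly maps the non-linearly separable data into a higher-dimensional space, and the solver finds the maximum-margin separating hyperplane there. The dataset and parameter values are illustrative assumptions; the support-vector count it prints is the quantity that drives the test-time cost discussed above.

```python
# Baseline RBF-kernel SVM (illustrative, not the paper's experimental setup).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Non-linearly separable toy data.
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# gamma is the RBF kernel parameter whose selection the paper's method automates.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

print("number of support vectors:", clf.n_support_.sum())  # drives test-time cost
print("test accuracy:", clf.score(X_te, y_te))
```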
