
Closed-form projection operator wavelet kernels in support vector learning for nonlinear dynamical systems identification


Abstract:

As a special idempotent operator, the projection operator plays a crucial role in the Spectral Decomposition Theorem for linear operators in Hilbert space. In this paper, an innovative orthogonal projection operator wavelet kernel is developed for support vector learning. Within the framework of multi-resolution analysis, the proposed wavelet kernel readily supports multi-scale, multidimensional learning for estimating complex dependencies. The distinctive advantage of the wavelet kernel developed in this paper is that it admits a closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form orthogonal projection wavelet kernel in the literature. In the setting of linear programming support vector learning, the proposed closed-form projection operator wavelet kernel is used to identify a parallel model of a benchmark nonlinear dynamical system. A simulation study confirms its superiority in model accuracy and sparsity.
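For a hedged illustration of what such a closed-form projection kernel can look like (a standard textbook instance, not necessarily the construction in this paper), consider the Shannon multi-resolution analysis with scaling function $\varphi(x) = \operatorname{sinc}(x) = \sin(\pi x)/(\pi x)$. The orthogonal projection $P_j$ onto the approximation space $V_j$ is an integral operator whose kernel sums to a closed form:

\[
(P_j f)(x) = \int_{\mathbb{R}} K_j(x, y)\, f(y)\, dy,
\qquad
K_j(x, y) = \sum_{k \in \mathbb{Z}} 2^{j}\, \varphi(2^{j}x - k)\, \varphi(2^{j}y - k)
= \frac{\sin\!\left(2^{j}\pi (x - y)\right)}{\pi (x - y)}.
\]

The operator is idempotent ($P_j^2 = P_j$), which is exactly the projection property emphasized above, and the kernel can be evaluated directly in a kernel machine without truncating the wavelet series; that is the practical point of having a closed form.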
Date of Conference: 04-09 August 2013
Date Added to IEEE Xplore: 09 January 2014
Conference Location: Dallas, TX, USA

I. Introduction

The kernel-based support vector (SV) learning was originally proposed and developed for nonlinear pattern recognition. As the cornerstone of nonlinear SV learning, kernel functions play an essential role in providing a general framework for representing data. The use of kernel functions allows high-dimensional inner-product computations to be performed with very little overhead and brings all the benefits of linear estimation. This simplicity and ease of application make the approach attractive to practitioners. However, SV learning for approximating real-valued functions is more delicate than the approximation of indicator functions in pattern recognition. Different real-valued function estimation problems call for different sets of approximating functions, depending on the complexity of the dependencies. The kernel determines the class of functions from which an SVM can draw its solution, and the choice of kernel significantly affects the performance of an SVM [1]–[2]. It is therefore important to construct special kernels that reflect special properties of the approximating functions [3]–[4].
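To make this workflow concrete, the following is a minimal sketch of linear programming SV regression applied to NARX-style identification. It is illustrative only: it assumes a product sinc kernel (the Shannon projection kernel above) as a stand-in for the projection operator wavelet kernel, a generic benchmark-style recursion y(k+1) = y(k)/(1 + y(k)^2) + u(k)^3 rather than the paper's exact benchmark, and SciPy's linprog solver; the names sinc_kernel and lp_svr_fit are hypothetical helpers, not the authors' code.

# Hedged sketch: linear programming SV regression with a product sinc kernel
# standing in for the projection operator wavelet kernel. All names, parameter
# values, and the benchmark recursion below are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

def sinc_kernel(X, Z, scale=1.0):
    # Product of one-dimensional sinc kernels across input dimensions;
    # np.sinc(t) = sin(pi*t)/(pi*t).
    diff = X[:, None, :] - Z[None, :, :]            # shape (n, m, d)
    return np.prod(np.sinc(scale * diff), axis=2)   # shape (n, m)

def lp_svr_fit(K, y, C=10.0, eps=0.01):
    # L1-regularized SV regression posed as a linear program:
    #   minimize  sum(|alpha|) + C * sum(xi)
    #   subject to |K @ alpha + b - y| <= eps + xi,  xi >= 0.
    # Split variables: alpha = ap - am, b = bp - bm, all components >= 0.
    n = len(y)
    c = np.concatenate([np.ones(2 * n), [0.0, 0.0], C * np.ones(n)])
    I, one = np.eye(n), np.ones((n, 1))
    top = np.hstack([K, -K, one, -one, -I])     # f(x) - y <= eps + xi
    bot = np.hstack([-K, K, -one, one, -I])     # y - f(x) <= eps + xi
    res = linprog(c, A_ub=np.vstack([top, bot]),
                  b_ub=np.concatenate([eps + y, eps - y]),
                  bounds=(0, None), method="highs")
    alpha = res.x[:n] - res.x[n:2 * n]
    b = res.x[2 * n] - res.x[2 * n + 1]
    return alpha, b

# Synthetic data from a benchmark-style NARX recursion (illustrative only):
#   y(k+1) = y(k) / (1 + y(k)^2) + u(k)^3
rng = np.random.default_rng(0)
N = 200
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N + 1)
for k in range(N):
    y[k + 1] = y[k] / (1.0 + y[k] ** 2) + u[k] ** 3

X = np.column_stack([y[:N], u])    # regressors [y(k), u(k)]
t = y[1:]                          # one-step-ahead target y(k+1)

K = sinc_kernel(X, X, scale=2.0)
alpha, b = lp_svr_fit(K, t)
pred = K @ alpha + b
print("train RMSE:", np.sqrt(np.mean((pred - t) ** 2)))
print("support vectors:", int(np.count_nonzero(np.abs(alpha) > 1e-6)), "of", N)

The L1 penalty on the expansion coefficients is what drives sparsity here: only samples with nonzero alpha act as support vectors, which mirrors the sparsity claim made in the abstract.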

