
Closed-form projection operator wavelet kernels in support vector learning for nonlinear dynamical systems identification



Abstract:

As a special idempotent operator, the projection operator plays a crucial role in the Spectral Decomposition Theorem for linear operators in Hilbert space. In this paper, a novel orthogonal projection operator wavelet kernel is developed for support vector learning. Within the framework of multi-resolution analysis, the proposed wavelet kernel readily supports multi-scale, multidimensional learning for estimating complex dependencies. The distinctive advantage of the kernel developed in this paper is that it can be expressed in closed form, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form orthogonal projection wavelet kernel in the literature. In the setting of linear programming support vector learning, the proposed closed-form projection operator wavelet kernel is used to identify a parallel model of a benchmark nonlinear dynamical system. A simulation study confirms its superiority in model accuracy and sparsity.
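The projection operator kernel itself is not reproduced on this page, so the following minimal sketch (Python/NumPy, not part of the paper) uses a different, well-known closed-form construction, a translation-invariant Mexican-hat product wavelet kernel in the spirit of the wavelet kernels discussed in [12], purely to illustrate how a multidimensional closed-form wavelet kernel is evaluated and how it enters the kernel expansion returned by linear programming SV learning. All names and parameter values (e.g., dilation) are hypothetical.

```python
# Illustrative sketch only: a translation-invariant Mexican-hat product wavelet
# kernel (cf. the wavelet kernels of [12]), NOT the orthogonal projection
# operator wavelet kernel proposed in the paper.
import numpy as np

def mexican_hat(u):
    """1-D Mexican-hat mother wavelet: psi(u) = (1 - u^2) exp(-u^2 / 2)."""
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def wavelet_kernel(x, z, dilation=1.0):
    """Multidimensional closed-form wavelet kernel as a tensor product of
    1-D wavelets: K(x, z) = prod_i psi((x_i - z_i) / dilation)."""
    u = (np.asarray(x) - np.asarray(z)) / dilation
    return float(np.prod(mexican_hat(u)))

def gram_matrix(X, dilation=1.0):
    """Kernel (Gram) matrix for a sample matrix X of shape (n, d)."""
    n = X.shape[0]
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = wavelet_kernel(X[i], X[j], dilation)
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 3))    # 5 kernel centres in R^3
    beta = rng.standard_normal(5)      # hypothetical LP-SVR coefficients
    x_new = rng.standard_normal(3)
    # Kernel expansion f(x) = sum_j beta_j K(x_j, x): the form of model
    # a linear programming support vector machine would return.
    f_x = sum(b * wavelet_kernel(xc, x_new) for b, xc in zip(beta, X))
    print("Gram matrix diagonal:", np.diag(gram_matrix(X)))
    print("f(x_new) =", f_x)
```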
Date of Conference: 04-09 August 2013
Date Added to IEEE Xplore: 09 January 2014

Conference Location: Dallas, TX, USA

I. Introduction

Kernel-based support vector (SV) learning was originally proposed and developed for nonlinear pattern recognition. As the cornerstone of nonlinear SV learning, kernel functions play an essential role in providing a general framework for representing data. The use of kernel functions allows high-dimensional inner-product computations to be performed with very little overhead and brings all the benefits of linear estimation. Its simplicity and ease of application make it attractive to practitioners. However, SV learning for approximating real-valued functions is more delicate than approximating the indicator functions used in pattern recognition. Different real-valued function estimation problems require different sets of approximating functions, owing to the complexity of the underlying dependencies. The kernel determines the class of functions from which an SVM can draw its solution, and the choice of kernel significantly affects the performance of an SVM [1], [2]. It is therefore important to construct special kernels that reflect the particular properties of the approximating functions [3], [4].
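To make the inner-product point above concrete, the following minimal sketch (Python/NumPy, not part of the paper) illustrates the kernel trick with a generic homogeneous degree-2 polynomial kernel: the inner product in the induced feature space is computed directly from the original inputs, without ever forming the feature map. This kernel is only an example and is unrelated to the wavelet kernel proposed in the paper.

```python
# Kernel-trick illustration: <phi(x), phi(z)> equals K(x, z) computed in the
# input space, so the high-dimensional inner product costs almost nothing.
import numpy as np

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel on R^2:
    phi(x) = (x1^2, x2^2, sqrt(2) x1 x2), so <phi(x), phi(z)> = (<x, z>)^2."""
    x1, x2 = x
    return np.array([x1**2, x2**2, np.sqrt(2.0) * x1 * x2])

def poly_kernel(x, z):
    """Kernel evaluation in the input space: K(x, z) = (<x, z>)^2."""
    return float(np.dot(x, z)) ** 2

x = np.array([0.7, -1.3])
z = np.array([2.0, 0.5])

explicit = float(np.dot(phi(x), phi(z)))   # inner product in feature space
implicit = poly_kernel(x, z)               # same value, feature map never built
assert np.isclose(explicit, implicit)
print(explicit, implicit)                  # both equal 0.5625
```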

References
[1] M. O. Stitson, A. Gammerman, V. Vapnik, V. Vovk, C. Watkins, and J. Weston, "Support vector regression with ANOVA decomposition kernels," in Advances in Kernel Methods: Support Vector Learning, B. Schölkopf, C. J. C. Burges, and A. J. Smola, Eds., 1999, pp. 285-291.
[2] B. Schölkopf and A. J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA, 2002.
[3] V. N. Vapnik, Statistical Learning Theory, Wiley, 1998.
[4] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, 2000.
[5] I. Daubechies, Ten Lectures on Wavelets, SIAM, 1992.
[6] A. I. Zayed and G. G. Walter, "Wavelets in closed forms," in Wavelet Transforms and Time-Frequency Signal Analysis, L. Debnath, Ed., Birkhäuser, 2001, pp. 121-143.
[7] S. Mallat, A Wavelet Tour of Signal Processing: The Sparse Way, Academic Press, New York, 2009.
[8] R. T. Ogden, Essential Wavelets for Statistical Applications and Data Analysis, Birkhäuser, 1997.
[9] G. G. Walter and X. P. Shen, Wavelets and Other Orthogonal Systems, Chapman & Hall/CRC, 2000.
[10] G. G. Walter and J. Zhang, "Orthonormal wavelets with simple closed-form expression," IEEE Trans. Signal Processing, vol. 46, pp. 2248-2251, 1998.
[11] Y. Meyer and R. Coifman, Wavelets: Calderón-Zygmund and Multilinear Operators, Cambridge University Press, 1997.
[12] L. Zhang, W. D. Zhou, and L. C. Jiao, "Wavelet support vector machines," IEEE Trans. Systems, Man, and Cybernetics-Part B: Cybernetics, vol. 34, no. 1, pp. 34-39, 2004.
[13] A. Widodo and B. S. Yang, "Wavelet support vector machine for induction machine fault diagnosis based on transient current signal," Expert Systems with Applications, vol. 35, pp. 307-316, 2008.
[14] Q. Wu, "The forecasting model based on wavelet ν-support vector machine," Expert Systems with Applications, vol. 36, pp. 7604-7610, 2009.
[15] Z. Lu, J. Sun, and K. Butts, "Linear programming support vector regression with wavelet kernel: A new approach to nonlinear dynamical systems identification," Mathematics and Computers in Simulation, vol. 79, no. 7, pp. 2051-2063, 2009.
[16] B. Vidakovic, Statistical Modeling by Wavelets, John Wiley & Sons, 1999.
[17] W. F. Zhang, D. Q. Dai, and H. Yan, "Framelet kernels with applications to support vector regression and regularization networks," IEEE Trans. Systems, Man, and Cybernetics-Part B: Cybernetics, vol. 40, pp. 1128-1143, 2010.
[18] P. Singla and J. L. Junkins, Multi-Resolution Methods for Modeling and Control of Dynamical Systems, Chapman & Hall/CRC, 2009.
[19] R. Opfer, "Multiscale kernels," Advances in Computational Mathematics, vol. 25, pp. 357-380, 2006.
[20] R. Opfer, Multiscale Kernels, Shaker Verlag GmbH, Germany, 2004.
[21] S. Xie, A. T. Lawniczak, S. Krishnan, and P. Liò, "Wavelet kernel principal component analysis in noisy multiscale data classification," ISRN Computational Mathematics, vol. 2012, pp. 1-13, 2012.
[22] S. Xie, A. T. Lawniczak, and P. Liò, "Features extraction via wavelet kernel PCA for data classification," in IEEE Int. Workshop on Machine Learning for Signal Processing, 2010, pp. 438-443.
[23] S. K. Berberian, Introduction to Hilbert Space, AMS Chelsea Publishing, 1999.
[24] A. J. Smola and B. Schölkopf, "A tutorial on support vector regression," Statistics and Computing, vol. 14, pp. 199-222, 2004.
[25] J. Krebs, "Support vector regression for the solution of linear integral equations," Inverse Problems, vol. 27, pp. 1-23, 2011.
[26] V. Kecman, Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models, MIT Press, Cambridge, MA, 2001.
[27] A. Smola, B. Schölkopf, and G. Rätsch, "Linear programs for automatic accuracy control in regression," in Proc. 9th Int. Conf. on Artificial Neural Networks, 1999, pp. 575-580.
[28] V. Kecman and I. Hadzic, "Support vectors selection by linear programming," in Proc. Int. Joint Conf. on Neural Networks, 2000, pp. 193-198.
[29] O. Nelles, Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models, Springer, 2001.
[30] M. Martinez-Ramon, J. L. Rojo-Alvarez, G. Camps-Valls, J. Munoz-Mari, A. Artes-Rodriguez, and A. R. Figueiras-Vidal, "Support vector machines for nonlinear kernel ARMA system identification," IEEE Trans. Neural Networks, vol. 17, pp. 1617-1622, 2006.
