
An Iterative Algorithm for Robust Kernel Principal Component Analysis


Abstract:

Principal component analysis (PCA) has been proven to be an efficient method for dimensionality reduction, feature extraction, and pattern recognition. Kernel principal component analysis (KPCA) can be regarded as a natural nonlinear generalization of PCA that implicitly performs linear PCA in a high-dimensional feature space by means of the kernel trick. However, both conventional PCA and KPCA suffer from the deficiency of being sensitive to outliers. Existing robust KPCA methods must eigen-decompose the Gram matrix directly at each step, which becomes computationally infeasible when the number of training samples is large because of the size of that matrix. By extending an existing robust PCA algorithm with kernel methods, we present a novel robust adaptive algorithm for computing the kernel principal components. The proposed method not only preserves KPCA's ability to capture the underlying nonlinear structure of the data, but is also robust against outliers because it restrains the effect of outlying samples. Compared with existing robust KPCA methods, our method runs without having to store the kernel matrix, which significantly reduces the storage burden. In addition, our method shows potential for extension to an incremental learning setting. Experimental results on synthetic data indicate that the improved algorithm is effective and promising.
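To make the computational bottleneck concrete, the following is a minimal Python/NumPy sketch of standard batch KPCA, assuming an RBF kernel with a hypothetical width parameter gamma; it illustrates the baseline the abstract contrasts against, not the proposed iterative algorithm. It must form and eigen-decompose the full n-by-n centered Gram matrix, which is precisely the storage and computation cost that grows prohibitive for large training sets.

import numpy as np

def batch_kpca(X, n_components, gamma=1.0):
    # Standard (non-robust) batch KPCA with an RBF kernel; gamma is a
    # hypothetical kernel width chosen for illustration only.
    n = len(X)
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel matrix in feature space.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigen-decompose the full n x n centered Gram matrix: the storage
    # and computation step that the proposed iterative method avoids.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    # Projections of the training samples onto the leading components.
    return eigvecs[:, :n_components] * np.sqrt(
        np.maximum(eigvals[:n_components], 0.0))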
Date of Conference: 19-22 August 2007
Date Added to IEEE Xplore: 29 October 2007

Conference Location: Hong Kong, China

1. Introduction

Principal Component Analysis (PCA) is an efficient method for dimensionality reduction and feature extraction, and has been widely used in many fields such as image processing, statistical analysis, and pattern recognition [1]. Conventional PCA finds a linear orthogonal basis transformation through an eigen-decomposition of the centered covariance matrix of the data set. Dimensionality reduction and feature extraction are then achieved by projecting the input data onto the subspace spanned by the principal eigenvectors corresponding to the largest eigenvalues.
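As a brief illustration of that procedure, the following minimal NumPy sketch carries out conventional PCA exactly as described: center the data, eigen-decompose the sample covariance matrix, and project onto the leading eigenvectors (function and variable names are illustrative only).

import numpy as np

def pca(X, n_components):
    # Conventional linear PCA: eigen-decomposition of the centered
    # covariance matrix, then projection onto the leading eigenvectors.
    Xc = X - X.mean(axis=0)                 # center the data
    C = Xc.T @ Xc / len(Xc)                 # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)    # eigh: ascending eigenvalues
    W = eigvecs[:, ::-1][:, :n_components]  # eigenvectors for the largest eigenvalues
    return Xc @ W                           # coordinates in the principal subspace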
