Abstract
This paper introduces the idea of generating a kernel from an arbitrary function by embedding the training samples into the function. Based on this idea, we develop two nonlinear feature extraction methods: generating kernel principal component analysis (GKPCA) and generating kernel Fisher discriminant (GKFD). These two methods are shown to be equivalent to the function-mapping-space PCA (FMS-PCA) and function-mapping-space LDA (FMS-LDA) methods, respectively. This equivalence reveals that the generating kernel is actually determined by the corresponding function map. From the generating kernel point of view, we can classify the current kernel Fisher discriminant (KFD) algorithms into two categories: KPCA+LDA based algorithms and straightforward KFD (SKFD) algorithms. The KPCA+LDA based algorithms work directly on the given kernel and are not suitable for non-kernel functions, whereas the SKFD algorithms essentially work on the kernel generated from a given symmetric function and are therefore suitable for non-kernels as well as kernels. Experiments using the Face Recognition Grand Challenge (FRGC) and the Biometric Experimentation Environment (BEE) system show the effectiveness of the generating kernel based methods. In particular, the generating kernels derived from the sigmoid function and the fractional power polynomial function achieve performance comparable to, or even better than, that of popular kernels such as the polynomial kernel and the Gaussian kernel.
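The abstract does not spell out the construction, but one common way to read "embedding the training samples into the function" is through the function map φ(x) = [f(x, x₁), …, f(x, xₙ)] over the n training samples, with the generating kernel taken as the inner product of function maps (which is positive semidefinite even when f itself, e.g. the sigmoid function, is not a Mercer kernel). The sketch below is a minimal Python illustration under that assumption; the helper names (`sigmoid_function`, `function_map`, `generating_kernel_pca`) and parameter values are ours, not the paper's.

```python
import numpy as np

def sigmoid_function(X, Y, a=1e-3, b=0.0):
    """Arbitrary symmetric function f(x, y) = tanh(a * x.y + b); not necessarily a valid kernel."""
    return np.tanh(a * X @ Y.T + b)

def function_map(X, X_train, f):
    """Map each row of X to its vector of function values against the training samples."""
    return f(X, X_train)                          # shape: (len(X), n_train)

def generating_kernel_pca(X_train, f, n_components=5):
    """A GKPCA-style sketch: PCA in the (centered) function-mapping space."""
    Phi = function_map(X_train, X_train, f)       # n x n function map of the training set
    K = Phi @ Phi.T                               # assumed generating kernel matrix (PSD by construction)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    Kc = H @ K @ H                                # center the kernel in feature space
    w, V = np.linalg.eigh(Kc)                     # eigendecomposition of the centered kernel
    idx = np.argsort(w)[::-1][:n_components]      # keep the leading components
    return w[idx], V[:, idx]

# Usage on toy data
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))
eigvals, eigvecs = generating_kernel_pca(X, sigmoid_function, n_components=3)
print(eigvals)
```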
| Original language | English (US) |
| --- | --- |
| Title of host publication | Perspectives on Pattern Recognition |
| Publisher | Nova Science Publishers, Inc. |
| Pages | 87-112 |
| Number of pages | 26 |
| ISBN (Print) | 9781612091181 |
| State | Published - 2011 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- General Engineering
Keywords
- Face recognition grand challenge (FRGC)
- Feature extraction
- Fisher linear discriminant analysis (FLD or LDA)
- Kernel methods
- Principal component analysis (PCA)