In pattern classification, Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are commonly used to reduce the dimensionality of the input feature space. However, open problems remain: how many eigenvectors should be retained for the transformation to be most effective, and how to preserve class separability in the low-dimensional data. In this paper, we present a new distance-based separator representation that addresses these problems. The representation's frame structure adapts to the complexity of the problem, and its dimensionality corresponds to the number of classes. Experimental results show that the new representation outperforms the PCA and LDA representations in multi-class and low-dimensional classification.
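The paper's exact separator construction is not given in this abstract; as a minimal sketch of the general idea, the following assumes a simple centroid-distance representation, in which each sample is mapped to its distances from the per-class centroids, so the output dimensionality equals the number of classes as the abstract states. The function name and the centroid choice are illustrative assumptions, not the authors' method.

```python
import numpy as np

def distance_representation(X, y, X_new=None):
    """Map samples to a vector of distances to each class centroid.

    Output dimensionality equals the number of classes, matching the
    abstract's claim; the paper's actual separator may be constructed
    differently (this is a hypothetical sketch).
    """
    classes = np.unique(y)
    # One centroid per class, estimated from the labeled training data
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    target = X if X_new is None else X_new
    # Euclidean distance from every sample to every centroid
    return np.linalg.norm(target[:, None, :] - centroids[None, :, :], axis=2)

# Toy data: two well-separated classes in a 4-D feature space
X = np.array([[0.0, 0, 0, 0], [1, 0, 0, 0],
              [10, 10, 10, 10], [11, 10, 10, 10]])
y = np.array([0, 0, 1, 1])
Z = distance_representation(X, y)
print(Z.shape)  # (4, 2): one column per class
```

Unlike PCA or LDA, such a mapping needs no choice of how many eigenvectors to keep: the target dimensionality is fixed by the number of classes.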
All Science Journal Classification (ASJC) codes
- Signal Processing
- Computer Vision and Pattern Recognition

Keywords
- Pattern representation
- Support vector machine