Abstract
Feature selection (FS) is an essential technique widely applied in data mining. Recent studies have shown that evolutionary computation (EC) is very promising for FS due to its powerful search capability. However, most existing EC-based FS methods use a fixed-length encoding to represent feature subsets. This inflexible encoding becomes ineffective when high-dimensional data are handled, because it results in a huge search space as well as substantial training time and memory overhead. In this article, we propose a length-adaptive genetic algorithm with Markov blanket (LAGAM), which adopts a variable-length individual encoding and enables individuals to evolve in their own search spaces. In LAGAM, features are sorted in decreasing order of relevance, and an adaptive length-changing operator is introduced, which extends or shortens an individual to guide it toward a better search space. Local search based on Markov blanket (MB) is embedded to further improve individuals. Experiments are conducted on 12 high-dimensional datasets, and the results show that LAGAM outperforms existing methods. Specifically, it achieves higher classification accuracy while using fewer features.
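To make the encoding idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of the two mechanisms the abstract describes: ranking features by relevance and encoding each individual as a variable-length binary mask over the top-ranked features, plus an adaptive length-changing operator that extends or shortens that mask. The relevance measure (mutual information), the step size, and all function names here are placeholder assumptions.

```python
# Illustrative sketch only; the relevance measure, step size, and names are assumptions.
import random

import numpy as np
from sklearn.feature_selection import mutual_info_classif


def rank_features(X, y):
    """Return feature indices sorted by decreasing relevance to the label."""
    relevance = mutual_info_classif(X, y, random_state=0)  # assumed relevance measure
    return np.argsort(relevance)[::-1]


def random_individual(length):
    """A variable-length individual: a binary mask over the top-`length` ranked features."""
    return [random.randint(0, 1) for _ in range(length)]


def change_length(individual, grow, step=5, max_length=None):
    """Adaptive length-changing operator: extend or shorten an individual's search space."""
    if grow:
        limit = max_length if max_length is not None else len(individual) + step
        extension = [random.randint(0, 1) for _ in range(min(step, limit - len(individual)))]
        return individual + extension
    return individual[: max(1, len(individual) - step)]


def decode(individual, ranked_idx):
    """Map an individual back to the original feature indices it selects."""
    return [ranked_idx[i] for i, bit in enumerate(individual) if bit == 1]
```

Because features are pre-sorted by relevance, shortening an individual discards only the least relevant tail of its search space, while extending it admits the next-most-relevant features; the MB-based local search described in the paper is omitted from this sketch.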
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 1-12 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Cybernetics |
| DOIs | |
| State | Accepted/In press - 2022 |
All Science Journal Classification (ASJC) codes
- Software
- Control and Systems Engineering
- Information Systems
- Human-Computer Interaction
- Computer Science Applications
- Electrical and Electronic Engineering
Keywords
- Classification
- Correlation
- Encoding
- Feature extraction
- Genetic algorithms
- Indexes
- Markov blanket (MB)
- Markov processes
- Training
- feature selection (FS)
- genetic algorithms (GAs)
- high-dimensional data
- length-adaptive
- machine learning