ABSTRACT
Feature selection is the process of finding the set of inputs to a machine learning algorithm that will yield the best performance. Developing a way to solve this problem automatically would make current machine learning methods much more useful. Previous efforts to automate feature selection rely on expensive meta-learning or are applicable only when labeled training data is available. This paper presents a novel method called FS-NEAT which extends the NEAT neuroevolution method to automatically determine an appropriate set of inputs for the networks it evolves. By learning the network's inputs, topology, and weights simultaneously, FS-NEAT addresses the feature selection problem without relying on meta-learning or labeled data. Initial experiments in an autonomous car racing simulation demonstrate that FS-NEAT can learn better and faster than regular NEAT. In addition, the networks it evolves are smaller and require fewer inputs. Furthermore, FS-NEAT's performance remains robust even as the feature selection task it faces is made increasingly difficult.
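The key idea, that evolution itself can perform feature selection when networks start without input connections, can be sketched as follows. This is a hedged illustration, not code from the paper: FS-NEAT is known to start each genome with a single randomly chosen input wired to an output (instead of NEAT's fully connected start), and the `make_initial_*` function names and tuple-based genome encoding here are invented for the example.

```python
import random

def make_initial_neat_genome(n_inputs, n_outputs):
    """Standard NEAT start: every input connected to every output
    with a random weight, so all features are used from the outset."""
    return [(i, o, random.uniform(-1.0, 1.0))
            for i in range(n_inputs)
            for o in range(n_outputs)]

def make_initial_fs_neat_genome(n_inputs, n_outputs):
    """FS-NEAT start: one randomly chosen input connected to one
    output. Further input connections must be added by structural
    mutation, so only inputs that prove useful tend to survive."""
    i = random.randrange(n_inputs)
    o = random.randrange(n_outputs)
    return [(i, o, random.uniform(-1.0, 1.0))]

# A network with 10 candidate features and 2 outputs:
neat_genome = make_initial_neat_genome(10, 2)
fs_genome = make_initial_fs_neat_genome(10, 2)
print(len(neat_genome))  # 20 connections: all features wired in
print(len(fs_genome))    # 1 connection: features selected by evolution
```

Because irrelevant inputs are never wired in unless a mutation adding them improves fitness, the evolved networks tend to be smaller and to use fewer features, consistent with the results the abstract reports.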