ABSTRACT
Surrogate models have been used for decades to speed up evolutionary algorithms; however, most of their uses are tailored to problems with a simple individual encoding, such as vectors of numbers. In this paper, we evaluate the possibility of using two different types of graph neural networks to predict the quality of a solution in tree-based genetic programming without evaluating the trees. The proposed models are evaluated on a number of benchmarks from symbolic regression and reinforcement learning, and the results show that GNNs can be successfully used as surrogate models for problems with a complex structure.
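To make the idea concrete, the following is a minimal, hypothetical sketch of the general shape of such a surrogate: a genetic-programming expression tree is encoded as a graph (one-hot node features over the symbol vocabulary plus an adjacency matrix), a single message-passing layer mixes each node's features with its neighbours', and a graph-level readout produces a scalar fitness estimate. All names, dimensions, and the random untrained weights are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Expression tree for (x + 3) * x, one node per row (node 0 is the root "*").
symbols = ["*", "+", "x", "x", "const"]
vocab = {"+": 0, "*": 1, "x": 2, "const": 3}
X = np.eye(4)[[vocab[s] for s in symbols]]   # (5, 4) one-hot node features
edges = [(0, 1), (0, 3), (1, 2), (1, 4)]     # parent -> child links

# Undirected adjacency with self-loops, so each node aggregates its own
# state plus its neighbours' -- the core message-passing update of a GNN.
A = np.eye(len(symbols))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

W = rng.standard_normal((4, 8))              # illustrative (untrained) weights
w_out = rng.standard_normal(8)               # readout weights

H = np.maximum(A @ X @ W, 0.0)               # one message-passing layer + ReLU
predicted_fitness = float(H.mean(axis=0) @ w_out)  # mean-pool graph readout
```

In a real surrogate, `W` and `w_out` would be trained on (tree, fitness) pairs from earlier generations, and the prediction would replace the expensive evaluation (e.g. a reinforcement-learning rollout) for some candidates.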