Optimization Neural Networks Architecture using Genetic Algorithm and Evolutionary Computing

  • Manaaf Abdulredha Yassen, College of Computer Science and Information Technology, University of Al-Qadisiyah, Iraq
  • Hayder Salah Abdulameer, College of Computer Science and Information Technology, University of Al-Qadisiyah, Iraq
Keywords: Neural Architecture Search, Genetic Algorithms, Evolutionary Computing, Deep Learning Optimization, Model Complexity, AutoML.

Abstract

This research introduces a GA-based system for optimizing neural network design in settings where traditional optimization methods struggle with the high dimensionality of the configuration space. In the framework, parameters including the number of layers, neurons per layer, activation functions, and regularization are encoded in a fixed-length chromosome, allowing evolutionary operators to modify the architecture over many generations. A composite fitness function balances accuracy against the number of parameters, supporting the development of architectures that perform well across different datasets. Experiments were conducted on three popular datasets: MNIST, Fashion-MNIST, and CIFAR-10. Manually constructed models and those found by random search did not match the performance of the GA-optimized models, which achieved 96.1% validation accuracy with only 180,000 parameters. Compared with existing NAS methods, the proposed approach uses parameters more efficiently and is more reliable, with performance that is more consistent across repeated training trials. Furthermore, the evolved architectures generalized well and transferred knowledge from one dataset to another. These results suggest that Genetic Algorithms are effective for automated neural architecture search and can be readily incorporated into both resource-constrained and hybrid AutoML pipelines.
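
As a rough illustration of the encoding and fitness design described above, the sketch below evolves a small multilayer-perceptron architecture with a genetic algorithm. It is an assumption-laden toy example, not the authors' implementation: scikit-learn's small digits dataset stands in for MNIST, and the chromosome layout, penalty weight, and GA operators (truncation selection, one-point crossover, point mutation) are illustrative choices.

```python
# Illustrative sketch only: hypothetical chromosome layout and GA settings,
# with scikit-learn's digits dataset standing in for MNIST so it runs quickly.
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

ACTIVATIONS = ["relu", "tanh", "logistic"]   # candidate activation functions
MAX_LAYERS, MAX_NEURONS = 3, 128             # assumed bounds of the search space
LAMBDA = 0.1                                  # assumed weight of the size penalty


def random_chromosome():
    """Fixed-length genome: [n_layers, neurons_1..neurons_3, activation_idx, l2_alpha]."""
    return [random.randint(1, MAX_LAYERS),
            *[random.randint(8, MAX_NEURONS) for _ in range(MAX_LAYERS)],
            random.randrange(len(ACTIVATIONS)),
            random.choice([1e-5, 1e-4, 1e-3])]


def fitness(chrom, X_tr, X_va, y_tr, y_va):
    """Composite fitness: validation accuracy minus a parameter-count penalty."""
    n_layers, *rest = chrom
    model = MLPClassifier(hidden_layer_sizes=tuple(rest[:n_layers]),
                          activation=ACTIVATIONS[rest[MAX_LAYERS]],
                          alpha=rest[MAX_LAYERS + 1],
                          max_iter=200, random_state=0)
    model.fit(X_tr, y_tr)
    n_params = sum(w.size for w in model.coefs_) + sum(b.size for b in model.intercepts_)
    return model.score(X_va, y_va) - LAMBDA * n_params / 1e5


def evolve(pop_size=8, generations=5):
    X, y = load_digits(return_X_y=True)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
    score = lambda c: fitness(c, X_tr, X_va, y_tr, y_va)
    population = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: keep the best half as parents.
        parents = sorted(population, key=score, reverse=True)[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.3:          # point mutation: resample one gene
                i = random.randrange(len(child))
                child[i] = random_chromosome()[i]
            children.append(child)
        population = parents + children
    return max(population, key=score)


if __name__ == "__main__":
    print("best chromosome found:", evolve())
```

Rewarding accuracy while penalizing parameter count is the mechanism that pushes such a search toward compact architectures like the 180,000-parameter models reported in the abstract.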

Published
2025-07-02
How to Cite
Yassen, M. A., & Abdulameer, H. S. (2025). Optimization Neural Networks Architecture using Genetic Algorithm and Evolutionary Computing. Central Asian Journal of Mathematical Theory and Computer Sciences, 6(3), 714–723. Retrieved from https://cajmtcs.centralasianstudies.org/index.php/CAJMTCS/article/view/796
Section
Articles