Optimization of Neural Networks using Deep Genetic Network Algorithm
Siddhartha Dhar Choudhury1, Kunal Mehrotra2, Shashank Pandey3, Christhu Raj4, Rajeev Sukumaran5

1Siddhartha Dhar Choudhury*, Computer Science and Engineering, SRM Institute of Science and Technology, Chennai, India.
2Kunal Mehrotra, Computer Science and Engineering, SRM Institute of Science and Technology, Chennai, India.
3Shashank Pandey, Computer Science and Engineering, SRM Institute of Science and Technology, Chennai, India.
4Christhu Raj, Computer Science and Engineering, SRM Institute of Science and Technology, Chennai, India.
5Rajeev Sukumaran, Teaching Learning Centre, Indian Institute of Technology Madras, Chennai, India.
Manuscript received on September 23, 2019. | Revised Manuscript received on October 15, 2019. | Manuscript published on October 30, 2019. | PP: 6494-6499 | Volume-9 Issue-1, October 2019 | Retrieval Number: A1128109119/2019©BEIESP | DOI: 10.35940/ijeat.A1128.109119
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Optimizing the performance of a neural network is a time-consuming and tedious process; this iterative and continuous process has no definite solution that works well for every possible use case. To tackle this problem we propose a neural network architecture called "Deep Genetic Network", which automatically selects hyperparameter values based on fitness measures during training. The algorithm is a confluence of deep neural networks and genetic algorithms. The problem of optimizing a neural network can be classified into architecture optimization and hyperparameter optimization, and a variety of algorithms have been proposed to address it. Our approach uses the concepts of mutation and mating from genetic algorithms to help the neural network find an optimal set of hyperparameter values during training, without manually setting the values in an iterative trial-and-error approach. The architecture we propose works well for optimizing hyperparameter values in convolutional, recurrent, and affine layers. Given adequate training time and computational resources, genetic algorithms have proven effective for this problem.
Keywords: Hyperparameter optimization, Neural Networks, Neural Network Optimization, Genetic Algorithms.
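The mutation-and-mating search described in the abstract can be illustrated with a minimal genetic-algorithm sketch. This is not the paper's implementation; the search space, operator names, and the toy fitness function standing in for validation accuracy are all illustrative assumptions.

```python
import random

# Hypothetical hyperparameter search space (names are illustrative,
# not taken from the paper).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "hidden_units": [32, 64, 128],
    "dropout": [0.0, 0.25, 0.5],
}

def random_individual():
    """Sample one hyperparameter configuration at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mate(parent_a, parent_b):
    """Uniform crossover: each gene is inherited from either parent."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in SEARCH_SPACE}

def mutate(individual, rate=0.2):
    """With probability `rate`, resample a gene from the search space."""
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in individual.items()}

def evolve(fitness, pop_size=10, generations=15, elite=2):
    """Evolve a population of configurations toward higher fitness.

    The elite configurations survive each generation unchanged; the rest
    of the population is refilled with mutated offspring of elite pairs.
    """
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:max(elite, 2)]  # need at least two to mate
        children = [mutate(mate(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

# Toy stand-in for a fitness measure; in practice this would train the
# network with the given configuration and return validation accuracy.
def toy_fitness(cfg):
    return (-abs(cfg["learning_rate"] - 1e-3)
            - abs(cfg["hidden_units"] - 64) / 100.0)

best = evolve(toy_fitness)
```

In a real run, each fitness evaluation is a (possibly partial) training of the network, which is why the abstract notes that adequate training time and computational resources are required.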