Journal of Electrical and Computer Engineering, Volume 2019, 2019-01-15
An Experiment on the Use of Genetic Algorithms for Topology Selection in Deep Learning
Research Article
Fernando Mattioli, Daniel Caetano, Alexandre Cardoso, Eduardo Naves, Edgard Lamounier
DOI:10.1155/2019/3217542
Received 2018-06-01; accepted 2018-12-03; published 2018-12-03
Abstract

Choosing a good topology for a deep neural network is a complex task that is essential to any deep learning project. It normally demands knowledge from previous experience, since the large amount of computational resources required makes trial-and-error approaches prohibitive. Evolutionary computation algorithms have succeeded in many domains by guiding the exploration of complex solution spaces toward the best solutions, with minimal human intervention. In this context, this work presents the use of genetic algorithms for topology selection in deep neural networks. The evaluated algorithms were able to find competitive topologies while spending fewer computational resources than state-of-the-art methods.
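The genetic-algorithm search described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the genome encoding (a list of hidden-layer widths), the truncation selection scheme, and in particular the `fitness` function (a placeholder standing in for briefly training each candidate network and measuring its validation accuracy) are all assumptions made so the example runs without a deep learning framework.

```python
import random

# Hypothetical search space: each genome encodes a topology as a list of
# hidden-layer widths. The paper's actual encoding is not reproduced here.
LAYER_CHOICES = [32, 64, 128, 256]
MAX_LAYERS = 4

def random_genome():
    n_layers = random.randint(1, MAX_LAYERS)
    return [random.choice(LAYER_CHOICES) for _ in range(n_layers)]

def fitness(genome):
    # Placeholder: in practice this would train the candidate network for a
    # few epochs and return its validation accuracy. Here we simply reward
    # moderate depth so the example is self-contained.
    return sum(genome) / (1 + abs(len(genome) - 2) * 100)

def crossover(a, b):
    # One-point crossover with independent cut points in each parent.
    cut_a = random.randint(0, len(a))
    cut_b = random.randint(0, len(b))
    child = (a[:cut_a] + b[cut_b:])[:MAX_LAYERS]
    return child or [random.choice(LAYER_CHOICES)]  # never empty

def mutate(genome, rate=0.2):
    # Each layer width is resampled with probability `rate`.
    return [random.choice(LAYER_CHOICES) if random.random() < rate else w
            for w in genome]

def evolve(pop_size=20, generations=10, seed=0):
    random.seed(seed)
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 2]  # truncation selection
        offspring = [mutate(crossover(random.choice(elite),
                                      random.choice(elite)))
                     for _ in range(pop_size - len(elite))]
        population = elite + offspring
    return max(population, key=fitness)

best = evolve()
print(best)
```

Swapping the placeholder `fitness` for a short training run of the decoded network is what makes the search expensive in practice, which is why the paper emphasizes the computational savings relative to exhaustive or trial-and-error approaches.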

License

Copyright © 2019 Fernando Mattioli et al.
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Corresponding Author

Fernando Mattioli, Faculty of Electrical Engineering, Federal University of Uberlândia (ufu.br), Uberlândia, MG, Brazil. mattioli.fernando@gmail.com

Recommended Citation

Fernando Mattioli, Daniel Caetano, Alexandre Cardoso, Eduardo Naves, Edgard Lamounier. An Experiment on the Use of Genetic Algorithms for Topology Selection in Deep Learning. Journal of Electrical and Computer Engineering, Vol. 2019 (2019).
