CLC number: TP18

On-line Access: 2021-06-21

Received: 2020-08-25

Revision Accepted: 2020-12-21

Crosschecked: 2021-05-20


 ORCID:

Saeid Nikbakht

https://orcid.org/0000-0003-2398-3962

Timon Rabczuk

https://orcid.org/0000-0002-7150-296X


Journal of Zhejiang University SCIENCE A 2021 Vol.22 No.6 P.407-426

http://doi.org/10.1631/jzus.A2000384


Optimizing the neural network hyperparameters utilizing genetic algorithm


Author(s):  Saeid Nikbakht, Cosmin Anitescu, Timon Rabczuk

Affiliation(s):  Division of Computational Mechanics, Ton Duc Thang University, Ho Chi Minh City, Vietnam

Corresponding email(s):   timon.rabczuk@tdtu.edu.vn

Key Words:  Machine learning, Neural network (NN), Hyperparameters, Genetic algorithm



Saeid Nikbakht, Cosmin Anitescu, Timon Rabczuk. Optimizing the neural network hyperparameters utilizing genetic algorithm[J]. Journal of Zhejiang University Science A, 2021, 22(6): 407-426.

@article{title="Optimizing the neural network hyperparameters utilizing genetic algorithm",
author="Saeid Nikbakht, Cosmin Anitescu, Timon Rabczuk",
journal="Journal of Zhejiang University Science A",
volume="22",
number="6",
pages="407-426",
year="2021",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.A2000384"
}



Abstract: 
Neural networks (NNs), among the most robust and efficient machine learning methods, are commonly used to solve a wide range of problems. However, the choice of hyperparameters (e.g., the numbers of layers and of neurons in each layer) has a significant influence on the accuracy of these methods. Therefore, a considerable number of studies have been carried out to optimize NN hyperparameters. In this study, a genetic algorithm is applied to an NN to find the optimal hyperparameters. First, the deep energy method, which contains a deep neural network, is applied to a Timoshenko beam and a plate with a hole. Subsequently, the numbers of hidden layers, integration points, and neurons in each layer are optimized to achieve the highest accuracy in predicting the stress distribution through these structures. Applying a suitable optimization method to the NN thus leads to a significant increase in prediction accuracy across the various examples.
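The hyperparameter search described in the abstract can be sketched as a small genetic algorithm over the numbers of hidden layers, neurons per layer, and integration points. The search ranges, operators, and fitness function below are illustrative assumptions only: the paper's actual objective (the deep energy method's training error) is replaced by a cheap stand-in.

```python
import random

# Hypothetical search space; the paper's actual ranges are not given here.
LAYERS = [2, 3, 4, 5]
NEURONS = [10, 20, 30, 40, 50]
POINTS = [20, 40, 60, 80]

def fitness(ind):
    """Stand-in for the DEM training error (lower is better).
    A real implementation would train the network for this combination
    and return, e.g., its relative strain-energy error."""
    layers, neurons, points = ind
    return abs(layers - 4) + abs(neurons - 30) / 10 + abs(points - 60) / 20

def crossover(a, b):
    # Single-point crossover on the 3-gene chromosome
    cut = random.randint(1, 2)
    return a[:cut] + b[cut:]

def mutate(ind):
    # Replace one randomly chosen gene with a fresh value from its pool
    i = random.randrange(3)
    pool = (LAYERS, NEURONS, POINTS)[i]
    ind = list(ind)
    ind[i] = random.choice(pool)
    return tuple(ind)

def ga_search(pop_size=10, generations=20, seed=0):
    random.seed(seed)
    pop = [(random.choice(LAYERS), random.choice(NEURONS), random.choice(POINTS))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # elitist truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

best = ga_search()
```

Because the top half of each generation is carried over unchanged, the best individual can never get worse from one generation to the next, which makes the search monotone in the best fitness.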

Optimizing neural network hyperparameters using a genetic algorithm

Objective: To demonstrate the influence of hyperparameter optimization on the accuracy of the deep energy method (DEM), and the capability of the DEM to predict the stress distribution in structures such as beams and plates under different loads.
Innovation points: 1. To improve the accuracy of the DEM, various hyperparameter combinations are fed into a genetic algorithm (GA) to find the optimal combination. 2. To prevent repeated computations and improve the efficiency of this metaheuristic algorithm, a tabu list of hyperparameter combinations is also maintained during the GA process.
Method: 1. Implement non-uniform rational B-splines (NURBS) to generate integration points through the body and boundaries of the structure. 2. Use the DEM to compute the displacement and stress distributions. 3. Optimize the DEM hyperparameters with a GA, since they have a significant impact on the model's accuracy in predicting the propagation of stress and displacement within the structure.
Conclusions: 1. Among the various optimizers and activation functions, the combination of the Adam and L-BFGS-B methods with the ReLU2 activation function gives the DEM model its highest accuracy. 2. Other hyperparameters that affect the model's prediction accuracy include the number of hidden layers, the number of neurons per layer, and the number of integration points through the above structures. 3. Optimizing the DEM hyperparameters reduces the relative strain-energy error by nearly 50%, improving the DEM model's ability to predict stress and displacement distributions.

Key words: Machine learning; Neural network; Hyperparameters; Genetic algorithm
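The tabu list of hyperparameter combinations mentioned in the summary, which keeps the GA from re-training the network on a combination it has already evaluated, amounts to memoizing the expensive evaluation. The sketch below uses illustrative names only; `fake_train` stands in for the costly DEM training run.

```python
evaluated = {}  # tabu list: hyperparameter combination -> stored error

def evaluate(combo, train_fn):
    """Return the cached error if combo was seen before; otherwise train once."""
    if combo not in evaluated:
        evaluated[combo] = train_fn(combo)
    return evaluated[combo]

calls = []
def fake_train(combo):
    # Stand-in for DEM training; records how often training actually runs
    calls.append(combo)
    return sum(combo)

e1 = evaluate((3, 20, 40), fake_train)
e2 = evaluate((3, 20, 40), fake_train)   # tabu hit: no retraining occurs
```

Since GA crossover and mutation frequently regenerate previously seen individuals, this cache can noticeably cut the number of training runs per search.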




Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2022 Journal of Zhejiang University-SCIENCE