A New Hybrid Grasshopper Optimization - Backpropagation for Feedforward Neural Network Training

Authors

  • Samer Alsammarraie
  • Nazar K. Hussein

DOI:

https://doi.org/10.25130/tjps.v25i1.221

Keywords:

artificial neural network, Grasshopper optimization algorithm, backpropagation algorithm, optimization

Abstract

The Grasshopper optimization algorithm (GOA) converges rapidly in the early phases of a global search, but the search slows considerably once it nears the global optimum. The gradient descent method, by contrast, converges faster in the neighborhood of the global optimum and reaches higher accuracy there. Accordingly, a hybrid algorithm that combines the Grasshopper optimization algorithm with the back-propagation (BP) algorithm, referred to as the GOA–BP algorithm, is proposed to train the weights of a feedforward neural network (FNN). The hybrid exploits the strong global search ability of GOA and the strong local search ability of BP. Experimental results show that the proposed hybrid GOA–BP algorithm outperforms both GOA and BP in convergence speed and accuracy.
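The abstract describes the hybrid only at a conceptual level. The following is a minimal sketch of that idea, assuming a small one-hidden-layer network on toy data, a simplified grasshopper-style position update for the global stage, and plain gradient descent standing in for back-propagation in the local stage; the layer sizes, hyperparameters, and helper names are illustrative assumptions, not the authors' settings.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data and a small 2-4-1 feedforward network; the 17 weights
# and biases are kept as one flat vector so both stages optimize the same object.
X = rng.uniform(-1.0, 1.0, (64, 2))
y = np.sin(X[:, :1]) + 0.5 * X[:, 1:]

def unpack(w):
    return w[:8].reshape(2, 4), w[8:12], w[12:16].reshape(4, 1), w[16:17]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)           # hidden activations
    return h, h @ W2 + b2              # hidden layer, network output

def mse(w):
    return float(np.mean((forward(w)[1] - y) ** 2))

def s(r):                              # GOA social-force term s(r) = f*exp(-r/l) - exp(-r)
    return 0.5 * np.exp(-r / 1.5) - np.exp(-r)

# Stage 1: grasshopper-style global search over the flat weight vector.
dim, pop, iters, lb, ub = 17, 30, 100, -2.0, 2.0
G = rng.uniform(lb, ub, (pop, dim))
best = min(G, key=mse).copy()
for it in range(iters):
    c = 1.0 - it * (1.0 - 1e-4) / iters          # shrinking comfort-zone coefficient
    for i in range(pop):
        move = np.zeros(dim)
        for j in range(pop):
            if i == j:
                continue
            d = np.linalg.norm(G[j] - G[i]) + 1e-12
            move += c * (ub - lb) / 2.0 * s(d) * (G[j] - G[i]) / d
        G[i] = np.clip(c * move + best, lb, ub)   # move relative to the best grasshopper
    cand = min(G, key=mse)
    if mse(cand) < mse(best):
        best = cand.copy()

# Stage 2: back-propagation (plain gradient descent) started from the GOA solution.
w, lr = best.copy(), 0.05
for _ in range(500):
    h, out = forward(w)
    W2 = unpack(w)[2]
    e = 2.0 * (out - y) / len(X)                  # dMSE/d(output)
    dh = (e @ W2.T) * (1.0 - h ** 2)              # backprop through tanh
    grad = np.concatenate([(X.T @ dh).ravel(), dh.sum(0), (h.T @ e).ravel(), e.sum(0)])
    w -= lr * grad

print(f"MSE after GOA stage: {mse(best):.4f}, after BP refinement: {mse(w):.4f}")

Running the sketch illustrates the division of labor the abstract argues for: the population stage supplies a reasonable starting point, and the gradient stage then drives the error down further than either stage does alone.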

Published

2020-02-02 — Updated on 2023-02-04

How to Cite

Samer Alsammarraie, & Nazar K. Hussein. (2023). A New Hybrid Grasshopper Optimization - Backpropagation for Feedforward Neural Network Training. Tikrit Journal of Pure Science, 25(1), 118–127. https://doi.org/10.25130/tjps.v25i1.221 (Original work published February 2, 2020)

Section

Articles
