
Mutation-based genetic neural network.

Paulito P Palmes, Taichi Hayasaka, Shiro Usui.

Abstract

Evolving gradient-learning artificial neural networks (ANNs) with an evolutionary algorithm (EA) is a popular way to address the local-optima and design problems of ANNs. The typical approach combines the strength of backpropagation (BP) in weight learning with the EA's capability of searching the architecture space. However, BP's gradient-descent procedure is computationally intensive, which restricts the EA's search coverage by compelling it to use a small population size. To address this problem, we use the mutation-based genetic neural network (MGNN), which replaces BP with the local-adaptation mutation strategy of evolutionary programming (EP) to effect weight learning. MGNN's mutation enables the network to evolve its structure and adapt its weights at the same time. Moreover, MGNN's EP-based encoding scheme allows a flexible, less restricted formulation of the fitness function and makes fitness computation fast and efficient. This makes larger population sizes feasible and gives MGNN a relatively wide search coverage of the architecture space. MGNN implements a stopping criterion in which overfitness occurrences are monitored through "sliding windows" to avoid premature learning and overlearning. Statistical analysis of its performance on some well-known classification problems demonstrates its good generalization capability. It also reveals that locally adapting or scheduling the strategy parameters embedded in each individual network may provide a proper balance between MGNN's local and global searching capabilities.
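The paper's own operators are not reproduced in this record; the following is a minimal sketch of the general mechanism the abstract describes — EP-style self-adaptive Gaussian mutation replacing gradient descent for weight learning, with strategy parameters (step sizes) embedded in each individual and locally adapted. The fixed 2-2-1 network, the XOR dataset, and all parameter choices are illustrative assumptions, not MGNN's actual configuration.

```python
import math
import random

random.seed(0)

# Toy dataset (illustrative assumption): XOR classification.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_W = 9  # 2-2-1 net: 2x(2 weights + bias) hidden, (2 weights + bias) output

def forward(w, x):
    """Fixed 2-2-1 feedforward net with sigmoid units; w is a flat list."""
    s = lambda z: 1.0 / (1.0 + math.exp(-z))
    h1 = s(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = s(w[3] * x[0] + w[4] * x[1] + w[5])
    return s(w[6] * h1 + w[7] * h2 + w[8])

def error(w):
    """Mean squared error over the dataset (the quantity to minimize)."""
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def mutate(ind):
    """EP-style self-adaptive mutation: each individual carries its own
    per-weight step sizes (strategy parameters). Step sizes are perturbed
    lognormally first, then the weights are perturbed with the new steps,
    so search granularity adapts locally as the abstract suggests."""
    w, sigma = ind
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(N_W))   # standard EP settings
    tau_p = 1.0 / math.sqrt(2.0 * N_W)
    g = random.gauss(0, 1)  # one draw shared by all sigmas of this individual
    new_sigma = [max(1e-4, s * math.exp(tau_p * g + tau * random.gauss(0, 1)))
                 for s in sigma]
    new_w = [wi + si * random.gauss(0, 1) for wi, si in zip(w, new_sigma)]
    return (new_w, new_sigma)

def evolve(pop_size=50, generations=300):
    """No backpropagation anywhere: weight learning is done entirely by
    mutation plus (mu + mu) truncation selection on the error."""
    pop = [([random.uniform(-1, 1) for _ in range(N_W)], [0.5] * N_W)
           for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(ind) for ind in pop]
        pop = sorted(pop + offspring, key=lambda ind: error(ind[0]))[:pop_size]
    return pop[0]

best = evolve()
print(round(error(best[0]), 4))
```

Because fitness here is just a forward pass per sample, evaluating a large population is cheap — the property the abstract credits for MGNN's wider architecture-space coverage. MGNN additionally mutates the structure itself and monitors overfitness through sliding windows; neither is shown in this sketch.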


Year:  2005        PMID: 15940989     DOI: 10.1109/TNN.2005.844858

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 3 in total

1.  A Novel Genetic Neural Network Algorithm with Link Switches and Its Application in University Professional Course Evaluation.

Authors:  Honghai Ji; Jinyao Zhou; Shida Liu; Li Wang; Lingling Fan
Journal:  Comput Intell Neurosci       Date:  2022-05-24

2.  A New Initialization Approach in Particle Swarm Optimization for Global Optimization Problems.

Authors:  Waqas Haider Bangyal; Abdul Hameed; Wael Alosaimi; Hashem Alyami
Journal:  Comput Intell Neurosci       Date:  2021-05-17

3.  Optimization of Deep Neural Networks Using SoCs with OpenCL.

Authors:  Rafael Gadea-Gironés; Ricardo Colom-Palero; Vicente Herrero-Bosch
Journal:  Sensors (Basel)       Date:  2018-04-30       Impact factor: 3.576

