Literature DB >> 17131664

A simplified dual neural network for quadratic programming with its KWTA application.

Shubao Liu, Jun Wang.

Abstract

The design, analysis, and application of a new recurrent neural network for quadratic programming, called the simplified dual neural network, are discussed. The analysis concentrates mainly on the convergence properties and the computational complexity of the neural network. The simplified dual neural network is shown to be globally convergent to the exact optimal solution. The architecture of the neural network is simplified, with the number of neurons equal to the number of inequality constraints. Its application to the k-winners-take-all (KWTA) operation is discussed to demonstrate how to solve problems with this neural network.
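For reference, the KWTA operation mentioned in the abstract maps an input vector to a binary vector that is 1 at the positions of the k largest inputs and 0 elsewhere. The sketch below is an illustrative direct implementation of that mapping, not the paper's recurrent-network method (which computes the same result by solving a quadratic program):

```python
import numpy as np

def kwta(u, k):
    """k-winners-take-all: 1 for the k largest inputs, 0 otherwise.

    Illustrative reference implementation; the paper's simplified dual
    neural network computes this mapping as the solution of a QP.
    """
    u = np.asarray(u, dtype=float)
    winners = np.argsort(u)[-k:]   # indices of the k largest inputs
    x = np.zeros_like(u)
    x[winners] = 1.0
    return x

print(kwta([0.2, 1.5, -0.3, 0.9, 0.4], 2))  # -> [0. 1. 0. 1. 0.]
```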

MeSH:

Year:  2006        PMID: 17131664     DOI: 10.1109/TNN.2006.881046

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  Convergence and rate analysis of neural networks for sparse approximation.

Authors:  Aurèle Balavoine; Justin Romberg; Christopher J Rozell
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2012-06-28       Impact factor: 10.451

2.  The general critical analysis for continuous-time UPPAM recurrent neural networks.

Authors:  Chen Qiao; Wen-Feng Jing; Jian Fang; Yu-Ping Wang
Journal:  Neurocomputing       Date:  2016-01-29       Impact factor: 5.719

