
Ordinal neural networks without iterative tuning.

Francisco Fernández-Navarro, Annalisa Riccardi, Sante Carloni.   

Abstract

Ordinal regression (OR) is an important branch of supervised learning that lies between multiclass classification and regression. In this paper, the traditional classification scheme of neural networks is adapted to learn ordinal ranks. The proposed model imposes monotonicity constraints on the weights connecting the hidden layer with the output layer. To do so, the weights are transcribed using padding variables. This reformulation leads to the so-called inequality constrained least squares (ICLS) problem. Its numerical solution can be obtained by several iterative methods, such as trust-region or line-search algorithms. In this proposal, the optimum is instead determined analytically, using the closed-form solution of the ICLS problem derived from the Karush-Kuhn-Tucker conditions. Furthermore, following the guidelines of the extreme learning machine framework, the weights connecting the input and hidden layers are randomly generated, so the final model estimates all of its parameters without iterative tuning. The proposed model achieves competitive performance compared with state-of-the-art neural network methods for OR.
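To illustrate the abstract's core idea, below is a minimal, simplified sketch of the extreme-learning-machine ingredient: random, untuned input-to-hidden weights combined with a closed-form least-squares solve for the output weights, applied to cumulative ("is the rank greater than k?") ordinal targets. All names and data here are hypothetical, and this sketch solves the *unconstrained* least-squares problem; the paper's actual model additionally enforces monotone output weights via padding variables and solves the resulting ICLS problem in closed form from the KKT conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ordinal data: 1-D input, 3 ordered classes (0 < 1 < 2),
# with class label increasing along x.
X = rng.uniform(-1, 1, size=(120, 1))
y = np.digitize(X[:, 0], bins=[-0.3, 0.4])

# ELM-style hidden layer: input weights and biases are drawn
# randomly once and never iteratively tuned.
n_hidden = 20
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)          # hidden-layer activations

# Cumulative ordinal targets: T[i, k] = 1 if y_i > k.
# Monotone outputs across k then respect the class ordering.
K = 3
T = (y[:, None] > np.arange(K - 1)[None, :]).astype(float)

# Closed-form least-squares output weights (unconstrained here;
# the paper instead solves an inequality-constrained LS whose
# closed form follows from the Karush-Kuhn-Tucker conditions).
beta = np.linalg.pinv(H) @ T

# Predicted rank = number of cumulative outputs above 0.5.
y_hat = (H @ beta > 0.5).sum(axis=1)
acc = (y_hat == y).mean()
print(f"train accuracy: {acc:.2f}")
```

No gradient descent is involved anywhere: once the random hidden layer is fixed, the entire fit is a single linear solve, which is what "without iterative tuning" refers to in the title.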


Year:  2014        PMID: 25330430     DOI: 10.1109/TNNLS.2014.2304976

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


Related articles: 2 in total

1.  Hierarchical Denoising of Ordinal Time Series of Clinical Scores.

Authors:  Jonathan Koss; Sule Tinaz; Hemant D Tagare
Journal:  IEEE J Biomed Health Inform       Date:  2022-07-01       Impact factor: 7.021

2.  Multiple Ordinal Regression by Maximizing the Sum of Margins.

Authors:  Onur C Hamsici; Aleix M Martinez
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2015-10-27       Impact factor: 10.451

